19665 1727204148.63755: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-G1p executable location = /usr/local/bin/ansible-playbook python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 19665 1727204148.64162: Added group all to inventory 19665 1727204148.64166: Added group ungrouped to inventory 19665 1727204148.64170: Group all now contains ungrouped 19665 1727204148.64173: Examining possible inventory source: /tmp/network-M6W/inventory-5vW.yml 19665 1727204148.83585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 19665 1727204148.83683: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 19665 1727204148.83708: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 19665 1727204148.83782: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 19665 1727204148.83885: Loaded config def from plugin (inventory/script) 19665 1727204148.83889: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 19665 1727204148.83939: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 19665 1727204148.84054: Loaded config def from plugin (inventory/yaml) 19665 1727204148.84057: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 19665 1727204148.84178: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 19665 1727204148.84727: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 19665 1727204148.84731: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 19665 1727204148.84735: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 19665 1727204148.84745: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 19665 1727204148.84751: Loading data from /tmp/network-M6W/inventory-5vW.yml 19665 1727204148.84842: /tmp/network-M6W/inventory-5vW.yml was not parsable by auto 19665 1727204148.84939: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 19665 1727204148.84985: Loading data from /tmp/network-M6W/inventory-5vW.yml 19665 1727204148.85092: group all already in inventory 19665 1727204148.85106: set inventory_file for managed-node1 19665 1727204148.85113: set inventory_dir for managed-node1 19665 1727204148.85114: Added host managed-node1 to inventory 19665 1727204148.85116: Added host managed-node1 to group all 19665 1727204148.85119: set ansible_host for managed-node1 19665 1727204148.85120: set ansible_ssh_extra_args for managed-node1 19665 1727204148.85124: set inventory_file for managed-node2 19665 1727204148.85137: set inventory_dir for managed-node2 19665 1727204148.85138: Added host managed-node2 to inventory 19665 1727204148.85140: Added host managed-node2 to group 
all 19665 1727204148.85141: set ansible_host for managed-node2 19665 1727204148.85142: set ansible_ssh_extra_args for managed-node2 19665 1727204148.85146: set inventory_file for managed-node3 19665 1727204148.85149: set inventory_dir for managed-node3 19665 1727204148.85149: Added host managed-node3 to inventory 19665 1727204148.85151: Added host managed-node3 to group all 19665 1727204148.85152: set ansible_host for managed-node3 19665 1727204148.85152: set ansible_ssh_extra_args for managed-node3 19665 1727204148.85155: Reconcile groups and hosts in inventory. 19665 1727204148.85159: Group ungrouped now contains managed-node1 19665 1727204148.85161: Group ungrouped now contains managed-node2 19665 1727204148.85163: Group ungrouped now contains managed-node3 19665 1727204148.85255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 19665 1727204148.85415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 19665 1727204148.85475: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 19665 1727204148.85508: Loaded config def from plugin (vars/host_group_vars) 19665 1727204148.85510: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 19665 1727204148.85520: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 19665 1727204148.85529: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 19665 1727204148.85585: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 19665 1727204148.85958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204148.86088: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 19665 1727204148.86126: Loaded config def from plugin (connection/local) 19665 1727204148.86129: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 19665 1727204148.86810: Loaded config def from plugin (connection/paramiko_ssh) 19665 1727204148.86815: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 19665 1727204148.88004: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 19665 1727204148.88058: Loaded config def from plugin (connection/psrp) 19665 1727204148.88061: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 19665 1727204148.88988: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 19665 1727204148.89034: Loaded config def from plugin (connection/ssh) 19665 1727204148.89041: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 19665 1727204148.89482: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 19665 1727204148.89531: Loaded config def from plugin (connection/winrm) 19665 1727204148.89534: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 19665 1727204148.89575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 19665 1727204148.89650: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 19665 1727204148.89742: Loaded config def from plugin (shell/cmd) 19665 1727204148.89744: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 19665 1727204148.89773: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 19665 1727204148.89845: Loaded config def from plugin (shell/powershell) 19665 1727204148.89848: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 19665 1727204148.89912: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 19665 1727204148.90117: Loaded config def from plugin (shell/sh) 19665 1727204148.90119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 19665 1727204148.90161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 19665 1727204148.90294: Loaded config def from plugin (become/runas) 19665 1727204148.90297: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 19665 1727204148.90524: Loaded config def from plugin (become/su) 19665 1727204148.90526: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 19665 1727204148.90676: Loaded config def from plugin (become/sudo) 19665 1727204148.90678: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 19665 1727204148.90712: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 19665 1727204148.91108: in VariableManager get_vars() 19665 1727204148.91140: done with get_vars() 19665 1727204148.91289: trying /usr/local/lib/python3.12/site-packages/ansible/modules 19665 1727204148.95672: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 19665 1727204148.95813: in VariableManager get_vars() 19665 1727204148.95819: done with get_vars() 19665 1727204148.95821: variable 'playbook_dir' from source: magic vars 19665 1727204148.95822: variable 'ansible_playbook_python' from source: magic vars 19665 1727204148.95823: variable 'ansible_config_file' from 
source: magic vars 19665 1727204148.95824: variable 'groups' from source: magic vars 19665 1727204148.95825: variable 'omit' from source: magic vars 19665 1727204148.95826: variable 'ansible_version' from source: magic vars 19665 1727204148.95826: variable 'ansible_check_mode' from source: magic vars 19665 1727204148.95827: variable 'ansible_diff_mode' from source: magic vars 19665 1727204148.95828: variable 'ansible_forks' from source: magic vars 19665 1727204148.95829: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204148.95829: variable 'ansible_skip_tags' from source: magic vars 19665 1727204148.95830: variable 'ansible_limit' from source: magic vars 19665 1727204148.95831: variable 'ansible_run_tags' from source: magic vars 19665 1727204148.95831: variable 'ansible_verbosity' from source: magic vars 19665 1727204148.95880: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml 19665 1727204148.96411: in VariableManager get_vars() 19665 1727204148.96429: done with get_vars() 19665 1727204148.96469: in VariableManager get_vars() 19665 1727204148.96484: done with get_vars() 19665 1727204148.96529: in VariableManager get_vars() 19665 1727204148.96544: done with get_vars() 19665 1727204148.96625: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 19665 1727204148.96878: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 19665 1727204148.96999: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 19665 1727204148.98016: in VariableManager get_vars() 19665 1727204148.98042: done with get_vars() 19665 1727204148.98600: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 19665 1727204148.98754: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204149.00291: in VariableManager get_vars() 19665 1727204149.00295: done with get_vars() 19665 1727204149.00297: variable 'playbook_dir' from source: magic vars 19665 1727204149.00298: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.00299: variable 'ansible_config_file' from source: magic vars 19665 1727204149.00300: variable 'groups' from source: magic vars 19665 1727204149.00301: variable 'omit' from source: magic vars 19665 1727204149.00301: variable 'ansible_version' from source: magic vars 19665 1727204149.00302: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.00303: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.00303: variable 'ansible_forks' from source: magic vars 19665 1727204149.00304: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.00305: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.00306: variable 'ansible_limit' from source: magic vars 19665 1727204149.00306: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.00307: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.00347: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 19665 1727204149.00461: in VariableManager get_vars() 19665 1727204149.00479: done with get_vars() 19665 1727204149.00515: in VariableManager get_vars() 19665 1727204149.00518: 
done with get_vars() 19665 1727204149.00520: variable 'playbook_dir' from source: magic vars 19665 1727204149.00521: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.00522: variable 'ansible_config_file' from source: magic vars 19665 1727204149.00523: variable 'groups' from source: magic vars 19665 1727204149.00523: variable 'omit' from source: magic vars 19665 1727204149.00524: variable 'ansible_version' from source: magic vars 19665 1727204149.00525: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.00526: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.00527: variable 'ansible_forks' from source: magic vars 19665 1727204149.00527: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.00528: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.00529: variable 'ansible_limit' from source: magic vars 19665 1727204149.00530: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.00530: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.00680: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 19665 1727204149.00751: in VariableManager get_vars() 19665 1727204149.00892: done with get_vars() 19665 1727204149.00947: in VariableManager get_vars() 19665 1727204149.00950: done with get_vars() 19665 1727204149.00952: variable 'playbook_dir' from source: magic vars 19665 1727204149.00953: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.00954: variable 'ansible_config_file' from source: magic vars 19665 1727204149.00955: variable 'groups' from source: magic vars 19665 1727204149.00956: variable 'omit' from source: magic vars 19665 1727204149.00956: variable 'ansible_version' from source: magic vars 19665 1727204149.00957: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.00958: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.00959: variable 'ansible_forks' from source: magic vars 19665 1727204149.00966: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.00967: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.00968: variable 'ansible_limit' from source: magic vars 19665 1727204149.00969: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.00969: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.01006: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 19665 1727204149.01083: in VariableManager get_vars() 19665 1727204149.01087: done with get_vars() 19665 1727204149.01089: variable 'playbook_dir' from source: magic vars 19665 1727204149.01090: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.01091: variable 'ansible_config_file' from source: magic vars 19665 1727204149.01091: variable 'groups' from source: magic vars 19665 1727204149.01092: variable 'omit' from source: magic vars 19665 1727204149.01093: variable 'ansible_version' from source: magic vars 19665 1727204149.01093: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.01094: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.01095: variable 'ansible_forks' from source: magic vars 19665 1727204149.01096: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.01096: variable 'ansible_skip_tags' from 
source: magic vars 19665 1727204149.01097: variable 'ansible_limit' from source: magic vars 19665 1727204149.01098: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.01099: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.01132: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 19665 1727204149.01207: in VariableManager get_vars() 19665 1727204149.01224: done with get_vars() 19665 1727204149.01277: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 19665 1727204149.01553: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 19665 1727204149.01647: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 19665 1727204149.02182: in VariableManager get_vars() 19665 1727204149.02208: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204149.04261: in VariableManager get_vars() 19665 1727204149.04286: done with get_vars() 19665 1727204149.04326: in VariableManager get_vars() 19665 1727204149.04329: done with get_vars() 19665 1727204149.04332: variable 'playbook_dir' from source: magic vars 19665 1727204149.04333: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.04333: variable 'ansible_config_file' from source: magic vars 19665 1727204149.04334: variable 'groups' from source: magic vars 19665 1727204149.04335: variable 'omit' from source: magic vars 19665 1727204149.04338: variable 'ansible_version' from source: magic vars 19665 1727204149.04339: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.04340: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.04341: variable 'ansible_forks' from source: magic vars 19665 1727204149.04341: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.04342: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.04343: variable 'ansible_limit' from source: magic vars 19665 1727204149.04344: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.04344: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.04385: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 19665 1727204149.04470: in VariableManager get_vars() 19665 1727204149.04487: done with get_vars() 19665 1727204149.04538: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 19665 1727204149.04733: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 19665 1727204149.04819: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 19665 1727204149.05259: in VariableManager get_vars() 19665 1727204149.05285: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204149.07251: in VariableManager get_vars() 19665 1727204149.07255: done with get_vars() 19665 1727204149.07257: variable 'playbook_dir' from source: magic vars 19665 1727204149.07258: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.07259: variable 'ansible_config_file' from source: magic vars 19665 1727204149.07260: variable 'groups' from 
source: magic vars 19665 1727204149.07261: variable 'omit' from source: magic vars 19665 1727204149.07262: variable 'ansible_version' from source: magic vars 19665 1727204149.07262: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.07265: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.07266: variable 'ansible_forks' from source: magic vars 19665 1727204149.07267: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.07268: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.07269: variable 'ansible_limit' from source: magic vars 19665 1727204149.07270: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.07271: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.07306: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 19665 1727204149.07462: in VariableManager get_vars() 19665 1727204149.07479: done with get_vars() 19665 1727204149.07526: in VariableManager get_vars() 19665 1727204149.07529: done with get_vars() 19665 1727204149.07534: variable 'playbook_dir' from source: magic vars 19665 1727204149.07535: variable 'ansible_playbook_python' from source: magic vars 19665 1727204149.07538: variable 'ansible_config_file' from source: magic vars 19665 1727204149.07538: variable 'groups' from source: magic vars 19665 1727204149.07539: variable 'omit' from source: magic vars 19665 1727204149.07540: variable 'ansible_version' from source: magic vars 19665 1727204149.07541: variable 'ansible_check_mode' from source: magic vars 19665 1727204149.07542: variable 'ansible_diff_mode' from source: magic vars 19665 1727204149.07542: variable 'ansible_forks' from source: magic vars 19665 1727204149.07543: variable 'ansible_inventory_sources' from source: magic vars 19665 1727204149.07544: variable 'ansible_skip_tags' from source: magic vars 19665 1727204149.07545: variable 'ansible_limit' from source: magic vars 19665 1727204149.07546: variable 'ansible_run_tags' from source: magic vars 19665 1727204149.07546: variable 'ansible_verbosity' from source: magic vars 19665 1727204149.07585: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml 19665 1727204149.07657: in VariableManager get_vars() 19665 1727204149.07673: done with get_vars() 19665 1727204149.07745: in VariableManager get_vars() 19665 1727204149.07758: done with get_vars() 19665 1727204149.07886: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 19665 1727204149.07905: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 19665 1727204149.08210: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 19665 1727204149.08401: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 19665 1727204149.08409: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 19665 1727204149.08450: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 19665 1727204149.08479: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 19665 1727204149.08738: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 19665 1727204149.08809: Loaded config def from plugin (callback/default) 19665 1727204149.08812: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204149.12953: Loaded config def from plugin (callback/junit) 19665 1727204149.12957: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204149.13014: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 19665 1727204149.13093: Loaded config def from plugin (callback/minimal) 19665 1727204149.13095: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204149.13213: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204149.13288: Loaded config def from plugin (callback/tree) 19665 1727204149.13290: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 19665 1727204149.13420: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 19665 1727204149.13423: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-G1p/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
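For reference, the inventory parsed at the top of this run (/tmp/network-M6W/inventory-5vW.yml) defines three hosts that land in the implicit 'ungrouped' group, each with ansible_host and ansible_ssh_extra_args host vars set. A minimal sketch of an inventory with that shape is shown below; the addresses and SSH options are placeholders, not values recorded in this log:

all:
  hosts:
    managed-node1:
      ansible_host: 192.0.2.11                               # placeholder; real address not shown in this log
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder option
    managed-node2:
      ansible_host: 192.0.2.12                               # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder
    managed-node3:
      ansible_host: 192.0.2.13                               # placeholder
      ansible_ssh_extra_args: "-o StrictHostKeyChecking=no"  # placeholder
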
PLAYBOOK: tests_bridge_nm.yml ************************************************** 11 plays in /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml 19665 1727204149.13454: in VariableManager get_vars() 19665 1727204149.13475: done with get_vars() 19665 1727204149.13480: in VariableManager get_vars() 19665 1727204149.13489: done with get_vars() 19665 1727204149.13493: variable 'omit' from source: magic vars 19665 1727204149.13533: in VariableManager get_vars() 19665 1727204149.13552: done with get_vars() 19665 1727204149.13580: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bridge.yml' with nm as provider] *********** 19665 1727204149.14247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 19665 1727204149.14332: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 19665 1727204149.14362: getting the remaining hosts for this loop 19665 1727204149.14366: done getting the remaining hosts for this loop 19665 1727204149.14369: getting the next task for host managed-node3 19665 1727204149.14373: done getting next task for host managed-node3 19665 1727204149.14374: ^ task is: TASK: Gathering Facts 19665 1727204149.14376: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204149.14379: getting variables 19665 1727204149.14380: in VariableManager get_vars() 19665 1727204149.14390: Calling all_inventory to load vars for managed-node3 19665 1727204149.14394: Calling groups_inventory to load vars for managed-node3 19665 1727204149.14396: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204149.14409: Calling all_plugins_play to load vars for managed-node3 19665 1727204149.14422: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204149.14425: Calling groups_plugins_play to load vars for managed-node3 19665 1727204149.14459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204149.14512: done with get_vars() 19665 1727204149.14519: done getting variables 19665 1727204149.14584: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6 Tuesday 24 September 2024 14:55:49 -0400 (0:00:00.012) 0:00:00.012 ***** 19665 1727204149.14607: entering _queue_task() for managed-node3/gather_facts 19665 1727204149.14608: Creating lock for gather_facts 19665 1727204149.14969: worker is 1 (out of 1 available) 19665 1727204149.14989: exiting _queue_task() for managed-node3/gather_facts 19665 1727204149.15003: done queuing things up, now waiting for results queue to drain 19665 1727204149.15005: waiting for pending results... 
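The first play ("Run playbook 'playbooks/tests_bridge.yml' with nm as provider") queues an implicit Gathering Facts task from tests_bridge_nm.yml:6 before any role tasks run. A play that produces this pattern would look roughly like the following sketch; only the play name and the referenced playbook path come from the log, while the hosts pattern, the provider-setting task, and the import mechanism are assumptions for illustration:

- name: Run playbook 'playbooks/tests_bridge.yml' with nm as provider
  hosts: all                     # assumed target pattern; the log shows managed-node3 receiving the task
  # gather_facts defaults to true, which is what queues the implicit
  # TASK [Gathering Facts] seen below
  tasks:
    - name: Set network provider to 'nm'     # hypothetical task name
      set_fact:
        network_provider: nm

- import_playbook: playbooks/tests_bridge.yml   # assumed mechanism for pulling in the remaining plays
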
19665 1727204149.15249: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204149.15361: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000007e 19665 1727204149.15396: variable 'ansible_search_path' from source: unknown 19665 1727204149.15435: calling self._execute() 19665 1727204149.15505: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204149.15515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204149.15528: variable 'omit' from source: magic vars 19665 1727204149.15639: variable 'omit' from source: magic vars 19665 1727204149.15703: variable 'omit' from source: magic vars 19665 1727204149.15838: variable 'omit' from source: magic vars 19665 1727204149.15890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204149.15943: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204149.15973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204149.15994: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204149.16009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204149.16052: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204149.16061: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204149.16073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204149.16180: Set connection var ansible_connection to ssh 19665 1727204149.16192: Set connection var ansible_shell_type to sh 19665 1727204149.16201: Set connection var ansible_timeout to 10 19665 1727204149.16209: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204149.16220: Set connection var ansible_shell_executable to /bin/sh 19665 1727204149.16231: Set connection var ansible_pipelining to False 19665 1727204149.16267: variable 'ansible_shell_executable' from source: unknown 19665 1727204149.16276: variable 'ansible_connection' from source: unknown 19665 1727204149.16283: variable 'ansible_module_compression' from source: unknown 19665 1727204149.16289: variable 'ansible_shell_type' from source: unknown 19665 1727204149.16294: variable 'ansible_shell_executable' from source: unknown 19665 1727204149.16301: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204149.16307: variable 'ansible_pipelining' from source: unknown 19665 1727204149.16313: variable 'ansible_timeout' from source: unknown 19665 1727204149.16319: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204149.16524: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204149.16545: variable 'omit' from source: magic vars 19665 1727204149.16554: starting attempt loop 19665 1727204149.16562: running the handler 19665 1727204149.16588: variable 'ansible_facts' from source: unknown 19665 1727204149.16610: _low_level_execute_command(): starting 19665 1727204149.16623: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204149.17404: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.17422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.17446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.17468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.17511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.17524: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.17542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.17569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.17582: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.17594: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.17607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.17622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.17642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.17657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.17675: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.17690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.17771: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.17798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.17815: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.17898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.19557: stdout chunk (state=3): >>>/root <<< 19665 1727204149.19682: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.19777: stderr chunk (state=3): >>><<< 19665 1727204149.19781: stdout chunk (state=3): >>><<< 19665 1727204149.19874: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204149.19878: _low_level_execute_command(): starting 19665 1727204149.19881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478 `" && echo ansible-tmp-1727204149.198032-19763-267797614457478="` echo /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478 `" ) && sleep 0' 19665 1727204149.21328: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.21357: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.21376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.21397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.21440: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.21453: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.21474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.21494: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.21506: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.21518: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.21531: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.21545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.21561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.21575: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.21591: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.21605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.21682: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.21708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.21724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.21802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.23655: stdout chunk (state=3): >>>ansible-tmp-1727204149.198032-19763-267797614457478=/root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478 <<< 19665 1727204149.23873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.23880: stdout chunk (state=3): >>><<< 19665 1727204149.23883: stderr chunk (state=3): >>><<< 19665 1727204149.23971: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204149.198032-19763-267797614457478=/root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204149.23976: variable 'ansible_module_compression' from source: unknown 19665 1727204149.24100: ANSIBALLZ: Using generic lock for ansible.legacy.setup 19665 1727204149.24108: ANSIBALLZ: Acquiring lock 19665 1727204149.24110: ANSIBALLZ: Lock acquired: 140619596462752 19665 1727204149.24112: ANSIBALLZ: Creating module 19665 1727204149.60698: ANSIBALLZ: Writing module into payload 19665 1727204149.60904: ANSIBALLZ: Writing module 19665 1727204149.60945: ANSIBALLZ: Renaming module 19665 1727204149.60955: ANSIBALLZ: Done creating module 19665 1727204149.61007: variable 'ansible_facts' from source: unknown 19665 1727204149.61023: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204149.61040: _low_level_execute_command(): starting 19665 1727204149.61051: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 19665 1727204149.61822: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.61839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.61858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.61886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.61930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.61945: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.61960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.61985: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.62002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.62012: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.62025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.62041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.62057: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.62081: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.62097: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.62115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.62203: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.62234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.62254: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.62348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.63999: stdout chunk (state=3): >>>PLATFORM <<< 19665 1727204149.64090: stdout chunk (state=3): >>>Linux <<< 19665 1727204149.64109: stdout chunk (state=3): >>>FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 19665 1727204149.64251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.64350: stderr chunk (state=3): >>><<< 19665 1727204149.64366: stdout chunk (state=3): >>><<< 19665 1727204149.64474: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.9 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204149.64485 [managed-node3]: found interpreters: ['/usr/bin/python3.9', '/usr/bin/python3', '/usr/bin/python3'] 19665 1727204149.64488: _low_level_execute_command(): starting 19665 1727204149.64490: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 && sleep 0' 19665 1727204149.64622: Sending initial data 19665 1727204149.64625: Sent initial data (1181 bytes) 19665 1727204149.65207: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.65222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.65246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.65266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.65308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 
1727204149.65319: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.65332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.65360: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.65373: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.65383: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.65393: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.65405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.65419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.65429: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.65441: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.65462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.65542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.65572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.65588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.65663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.69492: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 19665 1727204149.69893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.69998: stderr chunk (state=3): >>><<< 19665 1727204149.70009: stdout chunk (state=3): >>><<< 19665 1727204149.70276: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"9\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"9\"\nPLATFORM_ID=\"platform:el9\"\nPRETTY_NAME=\"CentOS Stream 9\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:9\"\nHOME_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 9\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204149.70280: variable 'ansible_facts' from source: unknown 19665 1727204149.70283: variable 'ansible_facts' from source: unknown 19665 1727204149.70285: variable 'ansible_module_compression' from source: unknown 19665 1727204149.70287: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204149.70289: variable 'ansible_facts' from source: unknown 19665 1727204149.70406: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/AnsiballZ_setup.py 19665 1727204149.70588: Sending initial data 19665 1727204149.70591: Sent initial data (153 bytes) 19665 1727204149.71673: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.71695: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.71714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.71732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.71781: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.71794: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.71819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.71842: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.71856: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.71870: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.71884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.71899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.71924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.71943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.71956: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.71974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.72061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.72087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.72105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.72189: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.73971: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports 
extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204149.74055: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204149.74104: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204149.74724: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpa9jmbg3d /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/AnsiballZ_setup.py <<< 19665 1727204149.76527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.76738: stderr chunk (state=3): >>><<< 19665 1727204149.76742: stdout chunk (state=3): >>><<< 19665 1727204149.76765: done transferring module to remote 19665 1727204149.76781: _low_level_execute_command(): starting 19665 1727204149.76786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/ /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/AnsiballZ_setup.py && sleep 0' 19665 1727204149.77477: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.77486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.77497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.77516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.77559: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.77567: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.77581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.77595: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.77603: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.77610: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.77623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.77640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.77649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.77657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.77665: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204149.77676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.77754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.77778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.77790: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.77866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.79585: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204149.79674: stderr chunk (state=3): >>><<< 19665 1727204149.79678: stdout chunk (state=3): >>><<< 19665 1727204149.79771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204149.79776: _low_level_execute_command(): starting 19665 1727204149.79779: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/AnsiballZ_setup.py && sleep 0' 19665 1727204149.80553: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204149.80578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.80597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.80616: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.80659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.80674: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204149.80692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.80707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204149.80717: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204149.80725: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204149.80735: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204149.80748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204149.80762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204149.80776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204149.80786: stderr chunk (state=3): >>>debug2: match found <<< 19665 
1727204149.80802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204149.80882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204149.80906: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204149.80922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204149.81001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204149.82903: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 19665 1727204149.82966: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 19665 1727204149.82995: stdout chunk (state=3): >>>import 'posix' # <<< 19665 1727204149.83026: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 19665 1727204149.83069: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 19665 1727204149.83129: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.83184: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 19665 1727204149.83188: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1edc0> <<< 19665 1727204149.83251: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 19665 1727204149.83255: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1eb20> <<< 19665 1727204149.83308: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1eac0> <<< 19665 1727204149.83319: stdout chunk (state=3): >>>import '_signal' # <<< 19665 1727204149.83332: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3490> <<< 19665 1727204149.83410: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 19665 1727204149.83414: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f9c5cec3940> <<< 19665 1727204149.83433: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3670> <<< 19665 1727204149.83455: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19665 1727204149.83501: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 19665 1727204149.83529: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19665 1727204149.83546: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19665 1727204149.83567: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 19665 1727204149.83588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19665 1727204149.83657: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a220> <<< 19665 1727204149.83685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19665 1727204149.83729: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a940> <<< 19665 1727204149.83772: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cedb880> <<< 19665 1727204149.83787: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 19665 1727204149.83790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce73d90> <<< 19665 1727204149.83850: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce9dd90> <<< 19665 1727204149.83899: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3970> <<< 19665 1727204149.83923: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
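The interpreter banner and the import trace around it appear because the AnsiballZ_setup.py wrapper was started with PYTHONVERBOSE=1 in the _low_level_execute_command call above, which makes CPython report every import and code-object load on stderr. A minimal sketch of how the same kind of trace can be reproduced locally (python3 on PATH and the json module are arbitrary illustrative choices, not taken from this run):

    import os
    import subprocess

    # Run a child interpreter with PYTHONVERBOSE=1; the verbose import
    # machinery writes lines like "import 'codecs' # <...Loader...>" and
    # "# code object from '...pyc'" to stderr, matching the chunks above.
    env = dict(os.environ, PYTHONVERBOSE="1")
    proc = subprocess.run(
        ["python3", "-c", "import json"],
        env=env,
        capture_output=True,
        text=True,
    )
    print(proc.stderr)

The same effect is available interactively via the interpreter's -v flag.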
<<< 19665 1727204149.84255: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 19665 1727204149.84311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19665 1727204149.84315: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19665 1727204149.84359: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19665 1727204149.84363: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19665 1727204149.84377: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3ef10> <<< 19665 1727204149.84424: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce440a0> <<< 19665 1727204149.84447: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19665 1727204149.84479: stdout chunk (state=3): >>>import '_sre' # <<< 19665 1727204149.84512: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19665 1727204149.84542: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 19665 1727204149.84569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce375b0> <<< 19665 1727204149.84578: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3e3d0> <<< 19665 1727204149.84591: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19665 1727204149.84641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19665 1727204149.84668: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 19665 1727204149.84699: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.84713: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19665 1727204149.84761: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cafae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafa940> import 'itertools' # <<< 19665 1727204149.84793: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafaf40> <<< 19665 1727204149.84841: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 19665 1727204149.84881: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19665 1727204149.84896: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafad90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0b100> import '_collections' # <<< 19665 1727204149.84949: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbeedc0> <<< 19665 1727204149.84962: stdout chunk (state=3): >>>import '_functools' # <<< 19665 1727204149.84981: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbe76a0> <<< 19665 1727204149.85021: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py <<< 19665 1727204149.85049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbfa700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce45eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19665 1727204149.85085: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cb0bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbee2e0> <<< 19665 1727204149.85131: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cbfa310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce4ba60> <<< 19665 1727204149.85144: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 19665 1727204149.85187: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.85231: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0bee0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0be20> <<< 19665 1727204149.85270: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0bd90> <<< 19665 1727204149.85273: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 19665 1727204149.85306: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19665 1727204149.85334: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 19665 1727204149.85375: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 19665 1727204149.85401: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cade400> <<< 19665 1727204149.85441: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19665 1727204149.85475: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cade4f0> <<< 19665 1727204149.85590: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb13f70> <<< 19665 1727204149.85638: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0dac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0d490> <<< 19665 1727204149.85671: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19665 1727204149.85714: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19665 1727204149.85736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 19665 1727204149.85771: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from 
'/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca12250> <<< 19665 1727204149.85791: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cac9550> <<< 19665 1727204149.85849: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0df40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce4b0d0> <<< 19665 1727204149.85887: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19665 1727204149.85915: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 19665 1727204149.85934: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca24b80> <<< 19665 1727204149.85945: stdout chunk (state=3): >>>import 'errno' # <<< 19665 1727204149.85982: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5ca24eb0> <<< 19665 1727204149.86018: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 19665 1727204149.86036: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca357c0> <<< 19665 1727204149.86060: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 19665 1727204149.86094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19665 1727204149.86123: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca35d00> <<< 19665 1727204149.86160: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9cf430> <<< 19665 1727204149.86189: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca24fa0> <<< 19665 1727204149.86204: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19665 1727204149.86254: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from 
'/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9df310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca35640> <<< 19665 1727204149.86268: stdout chunk (state=3): >>>import 'pwd' # <<< 19665 1727204149.86297: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9df3d0> <<< 19665 1727204149.86342: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0ba60> <<< 19665 1727204149.86363: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19665 1727204149.86382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 19665 1727204149.86406: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 19665 1727204149.86435: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19665 1727204149.86465: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fb730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 19665 1727204149.86492: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fb7f0> <<< 19665 1727204149.86531: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fb8e0> <<< 19665 1727204149.86560: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19665 1727204149.86757: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fbd30> <<< 19665 1727204149.86791: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5ca05280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fb970> <<< 19665 1727204149.86818: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9eeac0> <<< 19665 1727204149.86836: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0b640> <<< 19665 1727204149.86870: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 19665 1727204149.86919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 19665 1727204149.86955: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fbb20> <<< 19665 1727204149.87098: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 19665 1727204149.87123: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c5c3e6700> <<< 19665 1727204149.87352: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 19665 1727204149.87446: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.87484: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available <<< 19665 1727204149.87520: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.87524: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py <<< 19665 1727204149.87540: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.88744: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.89663: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325850> <<< 19665 1727204149.89700: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.89730: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 19665 1727204149.89766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 19665 1727204149.89779: stdout chunk (state=3): >>># extension module '_json' loaded from 
'/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c325160> <<< 19665 1727204149.89816: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325280> <<< 19665 1727204149.89861: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325fa0> <<< 19665 1727204149.89876: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19665 1727204149.89929: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c3254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325dc0> <<< 19665 1727204149.89933: stdout chunk (state=3): >>>import 'atexit' # <<< 19665 1727204149.89965: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c325580> <<< 19665 1727204149.89979: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 19665 1727204149.90000: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19665 1727204149.90050: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325100> <<< 19665 1727204149.90074: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 19665 1727204149.90095: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19665 1727204149.90116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19665 1727204149.90142: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19665 1727204149.90232: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2fa0a0> <<< 19665 1727204149.90293: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c1ff370> <<< 19665 1727204149.90322: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c1ff070> <<< 19665 1727204149.90334: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19665 1727204149.90372: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c1ffcd0> <<< 19665 1727204149.90389: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30ddc0> <<< 19665 1727204149.90556: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30d3a0> <<< 19665 1727204149.90571: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 19665 1727204149.90601: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30df40> <<< 19665 1727204149.90632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py <<< 19665 1727204149.90649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19665 1727204149.90668: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19665 1727204149.90714: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19665 1727204149.90734: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py <<< 19665 1727204149.90741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c35af40> <<< 19665 1727204149.90799: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32cd60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32c430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2d8af0> <<< 19665 1727204149.90857: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c32c550> <<< 19665 1727204149.90894: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32c580> <<< 19665 1727204149.90898: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 19665 
1727204149.90922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 19665 1727204149.90925: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19665 1727204149.90960: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19665 1727204149.91024: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c26dfa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36c280> <<< 19665 1727204149.91069: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 19665 1727204149.91072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19665 1727204149.91120: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c26b820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36c400> <<< 19665 1727204149.91148: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19665 1727204149.91208: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.91226: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py <<< 19665 1727204149.91229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 19665 1727204149.91281: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36cc40> <<< 19665 1727204149.91407: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c26b7c0> <<< 19665 1727204149.91496: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c3051c0> <<< 19665 1727204149.91538: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c36c9d0> <<< 19665 1727204149.91580: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from 
'/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c36c550> <<< 19665 1727204149.91583: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c365940> <<< 19665 1727204149.91613: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19665 1727204149.91639: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 19665 1727204149.91654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19665 1727204149.91687: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c25f910> <<< 19665 1727204149.91877: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c27ddc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c269550> <<< 19665 1727204149.91916: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c25feb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c269970> <<< 19665 1727204149.91961: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.91965: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 19665 1727204149.91989: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.92046: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.92133: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204149.92172: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py <<< 19665 1727204149.92176: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 19665 1727204149.92187: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.92274: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.92374: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.92823: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.93297: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py <<< 19665 1727204149.93335: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py <<< 19665 1727204149.93340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.93387: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c2ad7f0> <<< 19665 1727204149.93466: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' <<< 19665 1727204149.93482: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c27b760> <<< 19665 1727204149.93485: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be0a940> <<< 19665 1727204149.93554: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 19665 1727204149.93576: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.93579: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available <<< 19665 1727204149.93693: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.93837: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py <<< 19665 1727204149.93852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19665 1727204149.93873: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2e3730> # zipimport: zlib available <<< 19665 1727204149.94254: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.94614: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 19665 1727204149.94671: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.94739: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 19665 1727204149.94777: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.94822: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py <<< 19665 1727204149.94825: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.94875: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.94966: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py <<< 19665 1727204149.94986: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204149.95016: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 19665 1727204149.95020: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95032: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95067: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 19665 1727204149.95070: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95260: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95448: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19665 1727204149.95480: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 19665 1727204149.95558: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c3282e0> # zipimport: zlib available <<< 19665 1727204149.95624: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95712: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py <<< 19665 1727204149.95716: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 19665 1727204149.95728: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19665 1727204149.95755: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95801: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py <<< 19665 1727204149.95805: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95840: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95873: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.95971: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96039: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19665 1727204149.96061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.96135: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c29e880> <<< 19665 1727204149.96227: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bc86550> <<< 19665 1727204149.96277: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 19665 1727204149.96325: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96392: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96403: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96454: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py <<< 19665 1727204149.96457: stdout chunk (state=3): >>># code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 19665 1727204149.96486: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 19665 1727204149.96512: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 19665 1727204149.96556: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py <<< 19665 1727204149.96559: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19665 1727204149.96638: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2ae910> <<< 19665 1727204149.96675: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2f7970> <<< 19665 1727204149.96741: stdout chunk 
(state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2e1850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 19665 1727204149.96769: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96797: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 19665 1727204149.96883: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py <<< 19665 1727204149.96895: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py <<< 19665 1727204149.96924: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.96967: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97044: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97057: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97070: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97096: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97154: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97171: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97208: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py <<< 19665 1727204149.97211: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97281: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97341: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97365: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97403: stdout chunk (state=3): >>>import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 19665 1727204149.97547: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97691: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97725: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.97772: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204149.97807: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc 
matches /usr/lib64/python3.9/multiprocessing/context.py <<< 19665 1727204149.97822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' <<< 19665 1727204149.97840: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' <<< 19665 1727204149.97874: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bb8bc70> <<< 19665 1727204149.97909: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py <<< 19665 1727204149.97912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' <<< 19665 1727204149.97922: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py <<< 19665 1727204149.97976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' <<< 19665 1727204149.97992: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py <<< 19665 1727204149.97995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdeca30> <<< 19665 1727204149.98036: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bdec9a0> <<< 19665 1727204149.98111: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be2eb20> <<< 19665 1727204149.98115: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be2e550> <<< 19665 1727204149.98143: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f2e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f970> <<< 19665 1727204149.98176: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 19665 1727204149.98201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 19665 1727204149.98215: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 19665 1727204149.98272: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bdd02b0> <<< 19665 
1727204149.98275: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdd0a00> <<< 19665 1727204149.98299: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 19665 1727204149.98333: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdd0940> <<< 19665 1727204149.98362: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py <<< 19665 1727204149.98364: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 19665 1727204149.98387: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bbec0d0> <<< 19665 1727204149.98432: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c29a3a0> <<< 19665 1727204149.98459: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 19665 1727204149.98482: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py <<< 19665 1727204149.98511: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 19665 1727204149.98514: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.98557: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.98611: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 19665 1727204149.98670: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.98744: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 19665 1727204149.98748: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 19665 1727204149.98790: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 19665 1727204149.98793: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.98818: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 19665 1727204149.98861: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.98903: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 19665 1727204149.98947: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99000: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 19665 1727204149.99003: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99050: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99099: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99147: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99203: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py <<< 19665 1727204149.99222: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99598: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204149.99969: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 19665 1727204150.00013: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00062: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00092: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00143: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 19665 1727204150.00153: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00169: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00199: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available <<< 19665 1727204150.00251: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 19665 1727204150.00298: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py <<< 19665 1727204150.00326: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00348: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00372: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available <<< 19665 1727204150.00398: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00445: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py <<< 19665 1727204150.00448: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00496: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00572: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 19665 1727204150.00609: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5badaeb0> <<< 19665 1727204150.00622: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 19665 1727204150.00649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 19665 1727204150.00815: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bada9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py <<< 19665 1727204150.00819: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00870: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.00940: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 19665 1727204150.00943: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01009: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01089: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available <<< 19665 1727204150.01156: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01233: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 19665 1727204150.01236: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01267: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01311: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 19665 1727204150.01335: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 19665 1727204150.01483: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bb47bb0> <<< 19665 1727204150.01734: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bb02a60> <<< 19665 1727204150.01738: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 19665 1727204150.01787: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01846: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 19665 1727204150.01850: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01915: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.01989: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02079: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02230: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 19665 1727204150.02235: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02261: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02302: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 19665 1727204150.02396: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02495: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 19665 1727204150.02511: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bb4e040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bb4e6d0> import ansible.module_utils.facts.system.user # loaded 
from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 19665 1727204150.02524: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02536: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02581: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py <<< 19665 1727204150.02584: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02718: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.02849: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available <<< 19665 1727204150.02931: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03018: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03052: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03092: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 19665 1727204150.03105: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03188: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03212: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03326: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03467: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py <<< 19665 1727204150.03471: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03567: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03721: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available <<< 19665 1727204150.03740: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.03751: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04192: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04613: stdout chunk (state=3): >>>import 
ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py <<< 19665 1727204150.04617: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04695: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04791: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py <<< 19665 1727204150.04795: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04877: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.04956: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available <<< 19665 1727204150.05089: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05237: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 19665 1727204150.05252: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 19665 1727204150.05288: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05349: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 19665 1727204150.05353: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05415: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05489: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05659: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05836: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 19665 1727204150.05840: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05869: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05906: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available 
<<< 19665 1727204150.05928: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.05955: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 19665 1727204150.06021: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06077: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available <<< 19665 1727204150.06100: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06134: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 19665 1727204150.06176: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06229: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available <<< 19665 1727204150.06283: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06339: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py <<< 19665 1727204150.06345: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06551: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06774: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 19665 1727204150.06777: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06821: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06882: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py <<< 19665 1727204150.06885: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06925: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.06950: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available <<< 19665 1727204150.06974: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07007: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 19665 1727204150.07044: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 19665 1727204150.07083: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 19665 1727204150.07086: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07140: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07219: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py <<< 19665 1727204150.07243: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 19665 1727204150.07261: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07293: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07326: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 19665 1727204150.07358: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07376: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07410: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07446: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07505: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07576: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py <<< 19665 1727204150.07593: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available <<< 19665 1727204150.07628: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.07673: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 19665 1727204150.07837: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08024: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py <<< 19665 1727204150.08038: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204150.08085: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.netbsd # 
loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available <<< 19665 1727204150.08126: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08174: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py <<< 19665 1727204150.08177: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08236: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08317: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available <<< 19665 1727204150.08385: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08472: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py <<< 19665 1727204150.08485: stdout chunk (state=3): >>>import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 19665 1727204150.08547: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.08711: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 19665 1727204150.08744: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 19665 1727204150.08785: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bad0eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bad0b80> <<< 19665 1727204150.08842: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ba7c940> <<< 19665 1727204150.09777: stdout chunk (state=3): >>>import 'gc' # <<< 19665 1727204150.11816: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py <<< 19665 1727204150.11830: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' <<< 19665 1727204150.11865: stdout chunk (state=3): >>>import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bad0160> <<< 19665 1727204150.11884: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py <<< 19665 1727204150.11899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' <<< 19665 1727204150.11904: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ba92130> <<< 19665 1727204150.11960: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.11996: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5b8e6250> <<< 19665 1727204150.11999: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5b8e6040> <<< 19665 1727204150.12276: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 19665 1727204150.36244: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", 
"net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "50", "epoch": "1727204150", "epoch_int": "1727204150", "date": "2024-09-24", "time": "14:55:50", "iso8601_micro": "2024-09-24T18:55:50.102020Z", "iso8601": "2024-09-24T18:55:50Z", "iso8601_basic": "20240924T145550102020", "iso8601_basic_short": "20240924T145550", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", 
"SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.4, "5m": 0.35, "15m": 0.17}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(<<< 19665 1727204150.36275: stdout chunk (state=3): >>>R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2825, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 707, "free": 2825}, "nocache": {"free": 3284, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 496, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282124288, "block_size": 4096, "block_total": 65519355, "block_available": 64522003, "block_used": 997352, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": 
"on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204150.36981: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache <<< 19665 1727204150.37174: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing 
_datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing 
multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd 
# destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 19665 1727204150.37310: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19665 1727204150.37320: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 19665 1727204150.37348: stdout chunk (state=3): >>># destroy zipimport # destroy _compression <<< 19665 1727204150.37363: stdout chunk (state=3): >>># destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 19665 1727204150.37428: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid <<< 19665 1727204150.37475: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 19665 1727204150.37508: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector <<< 19665 1727204150.37545: stdout chunk (state=3): >>># destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle <<< 19665 1727204150.37573: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.reduction <<< 19665 1727204150.37637: stdout chunk (state=3): >>># destroy shlex # destroy datetime # destroy base64 <<< 19665 1727204150.37679: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 19665 1727204150.37708: stdout chunk (state=3): >>># destroy socket # 
destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection <<< 19665 1727204150.37787: stdout chunk (state=3): >>># cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid <<< 19665 1727204150.37879: stdout chunk (state=3): >>># cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading <<< 19665 1727204150.37926: stdout chunk (state=3): >>># cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 19665 1727204150.38000: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 19665 1727204150.38003: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping 
_warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 19665 1727204150.38060: stdout chunk (state=3): >>># destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader <<< 19665 1727204150.38065: stdout chunk (state=3): >>># destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 19665 1727204150.38230: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 19665 1727204150.38296: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath # destroy stat <<< 19665 1727204150.38317: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors <<< 19665 1727204150.38335: stdout chunk (state=3): >>># destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator <<< 19665 1727204150.38340: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 19665 1727204150.38363: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 19665 1727204150.38793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204150.38796: stdout chunk (state=3): >>><<< 19665 1727204150.38798: stderr chunk (state=3): >>><<< 19665 1727204150.39056: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1edc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec33a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1eb20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cf1eac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from 
'/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce9d850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce7a940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cedb880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce73d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce9dd90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cec3970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3ef10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce440a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce375b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3f6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce3e3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # 
code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cafae50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafa940> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafaf40> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cafad90> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0b100> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbeedc0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbe76a0> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbfa700> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce45eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cb0bd00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cbee2e0> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5cbfa310> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce4ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0bee0> 
import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0be20> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0bd90> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cade400> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cade4f0> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb13f70> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0dac0> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0d490> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca12250> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cac9550> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0df40> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ce4b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca24b80> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5ca24eb0> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches 
/usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca357c0> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca35d00> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9cf430> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca24fa0> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9df310> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ca35640> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9df3d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0ba60> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fb730> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fba00> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fb7f0> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fb8e0> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c9fbd30> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5ca05280> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fb970> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9eeac0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5cb0b640> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c9fbb20> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f9c5c3e6700> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325850> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c325160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c5c325fa0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c3254f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325dc0> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c325580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c325100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2fa0a0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c1ff370> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c1ff070> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c1ffcd0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30ddc0> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30d3a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c30df40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c35af40> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32cd60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32c430> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2d8af0> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c32c550> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c32c580> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c26dfa0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36c280> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c26b820> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36c400> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c36cc40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c26b7c0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c3051c0> # extension 
module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c36c9d0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c36c550> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c365940> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c25f910> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c27ddc0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c269550> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c25feb0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c269970> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 
'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c2ad7f0> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c27b760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be0a940> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/_text.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2e3730> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c3282e0> # zipimport: zlib available # zipimport: 
zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5c29e880> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bc86550> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2ae910> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2f7970> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c2e1850> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bb8bc70> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdeca30> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bdec9a0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be2eb20> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be2e550> import 
'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f2e0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f970> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bdd02b0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdd0a00> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bdd0940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bbec0d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5c29a3a0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5be1f670> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # 
zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5badaeb0> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 
'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bada9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bb47bb0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bb02a60> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bb4e040> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f9c5bb4e6d0> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip 
/tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib 
available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: 
zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_ansible.legacy.setup_payload_7mbxpg2z/ansible_ansible.legacy.setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available # /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f9c5bad0eb0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bad0b80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ba7c940> import 'gc' # # /usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/queues.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/queues.cpython-39.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5bad0160> # /usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/synchronize.cpython-39.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5ba92130> # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/dummy/__pycache__/connection.cpython-39.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5b8e6250> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f9c5b8e6040> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", 
"ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_is_chroot": false, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "50", "epoch": "1727204150", "epoch_int": "1727204150", "date": "2024-09-24", "time": "14:55:50", "iso8601_micro": "2024-09-24T18:55:50.102020Z", "iso8601": "2024-09-24T18:55:50Z", "iso8601_basic": "20240924T145550102020", "iso8601_basic_short": "20240924T145550", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.4, "5m": 0.35, "15m": 0.17}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2825, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 707, "free": 2825}, "nocache": {"free": 3284, "used": 248}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], 
"masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 496, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282124288, "block_size": 4096, "block_total": 65519355, "block_available": 64522003, "block_used": 997352, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_service_mgr": "systemd", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing 
hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing 
ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing 
getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy 
ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy 
ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
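The "auto-mux: Trying existing master" and "mux_client_request_session" lines above show OpenSSH reusing a persistent multiplexed connection instead of opening a fresh SSH session for every command; "Shared connection to 10.31.15.87 closed." only releases the mux client, while the master connection stays up for later tasks. That behaviour comes from the ControlMaster/ControlPersist options ansible-core passes to ssh. The following ansible.cfg sketch is illustrative only: the values mirror the usual ansible-core defaults and are not taken from this run's configuration.

    [ssh_connection]
    # Reuse one master connection per host; keep it alive 60s after the last session.
    ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
    # Directory for the ControlPath sockets (assumed default location).
    control_path_dir = ~/.ansible/cp
    # Pipelining is off in this run (ansible_pipelining is set to False below), so
    # module payloads such as AnsiballZ_setup.py are copied to the remote tmp dir
    # over SFTP before being executed.
    pipelining = False

The module-cleanup trace above, and the "junk after the JSON data" warning that follows, are a side effect of the high verbosity rather than an error: the module is executed under PYTHONVERBOSE=1 (visible in the later _low_level_execute_command call), so the remote Python interpreter prints its import and shutdown trace, and Ansible extracts the JSON result and reports the leftover text as junk.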
[WARNING]: Module invocation had junk after the JSON data: # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy 
zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # 
cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing 
ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy 
ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing gc # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy 
multiprocessing.pool # destroy pickle # destroy _compat_pickle # destroy queue # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping gc # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # 
cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy gc # destroy unicodedata # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks [WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.9, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 19665 1727204150.40912: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204150.40916: _low_level_execute_command(): starting 19665 1727204150.40918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204149.198032-19763-267797614457478/ > /dev/null 2>&1 && sleep 0' 19665 1727204150.42614: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.42618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.42632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.42766: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204150.42770: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204150.42773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.42853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.42857: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204150.42986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.43086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.44827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204150.44912: stderr chunk (state=3): >>><<< 19665 1727204150.44916: stdout chunk (state=3): >>><<< 19665 1727204150.45238: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204150.45242: handler run complete 19665 1727204150.45245: variable 'ansible_facts' from source: unknown 19665 1727204150.45249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.45520: variable 'ansible_facts' from source: unknown 19665 1727204150.45630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.45692: attempt loop complete, returning result 19665 1727204150.45696: _execute() done 19665 1727204150.45698: dumping result to json 19665 1727204150.45730: done dumping result, returning 19665 1727204150.45738: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-00000000007e] 19665 1727204150.45747: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000007e 19665 1727204150.46110: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000007e 19665 1727204150.46113: WORKER PROCESS EXITING ok: [managed-node3] 19665 1727204150.46413: no more pending results, returning what we have 19665 1727204150.46417: results queue empty 19665 1727204150.46418: checking for any_errors_fatal 19665 1727204150.46419: done checking for any_errors_fatal 19665 1727204150.46420: checking for max_fail_percentage 19665 1727204150.46422: 
done checking for max_fail_percentage 19665 1727204150.46422: checking to see if all hosts have failed and the running result is not ok 19665 1727204150.46423: done checking to see if all hosts have failed 19665 1727204150.46424: getting the remaining hosts for this loop 19665 1727204150.46427: done getting the remaining hosts for this loop 19665 1727204150.46431: getting the next task for host managed-node3 19665 1727204150.46438: done getting next task for host managed-node3 19665 1727204150.46440: ^ task is: TASK: meta (flush_handlers) 19665 1727204150.46442: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204150.46446: getting variables 19665 1727204150.46448: in VariableManager get_vars() 19665 1727204150.46484: Calling all_inventory to load vars for managed-node3 19665 1727204150.46487: Calling groups_inventory to load vars for managed-node3 19665 1727204150.46491: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204150.46502: Calling all_plugins_play to load vars for managed-node3 19665 1727204150.46505: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204150.46509: Calling groups_plugins_play to load vars for managed-node3 19665 1727204150.46682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.47081: done with get_vars() 19665 1727204150.47093: done getting variables 19665 1727204150.47198: in VariableManager get_vars() 19665 1727204150.47209: Calling all_inventory to load vars for managed-node3 19665 1727204150.47211: Calling groups_inventory to load vars for managed-node3 19665 1727204150.47215: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204150.47220: Calling all_plugins_play to load vars for managed-node3 19665 1727204150.47222: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204150.47225: Calling groups_plugins_play to load vars for managed-node3 19665 1727204150.47482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.47618: done with get_vars() 19665 1727204150.47629: done queuing things up, now waiting for results queue to drain 19665 1727204150.47630: results queue empty 19665 1727204150.47631: checking for any_errors_fatal 19665 1727204150.47632: done checking for any_errors_fatal 19665 1727204150.47633: checking for max_fail_percentage 19665 1727204150.47640: done checking for max_fail_percentage 19665 1727204150.47640: checking to see if all hosts have failed and the running result is not ok 19665 1727204150.47641: done checking to see if all hosts have failed 19665 1727204150.47641: getting the remaining hosts for this loop 19665 1727204150.47642: done getting the remaining hosts for this loop 19665 1727204150.47644: getting the next task for host managed-node3 19665 1727204150.47647: done getting next task for host managed-node3 19665 1727204150.47649: ^ task is: TASK: Include the task 'el_repo_setup.yml' 19665 1727204150.47650: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 19665 1727204150.47651: getting variables 19665 1727204150.47652: in VariableManager get_vars() 19665 1727204150.47662: Calling all_inventory to load vars for managed-node3 19665 1727204150.47666: Calling groups_inventory to load vars for managed-node3 19665 1727204150.47668: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204150.47672: Calling all_plugins_play to load vars for managed-node3 19665 1727204150.47674: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204150.47675: Calling groups_plugins_play to load vars for managed-node3 19665 1727204150.47758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.47873: done with get_vars() 19665 1727204150.47879: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:11 Tuesday 24 September 2024 14:55:50 -0400 (0:00:01.333) 0:00:01.346 ***** 19665 1727204150.47938: entering _queue_task() for managed-node3/include_tasks 19665 1727204150.47939: Creating lock for include_tasks 19665 1727204150.48189: worker is 1 (out of 1 available) 19665 1727204150.48202: exiting _queue_task() for managed-node3/include_tasks 19665 1727204150.48214: done queuing things up, now waiting for results queue to drain 19665 1727204150.48216: waiting for pending results... 19665 1727204150.48362: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 19665 1727204150.48432: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000006 19665 1727204150.48445: variable 'ansible_search_path' from source: unknown 19665 1727204150.48478: calling self._execute() 19665 1727204150.48532: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204150.48536: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204150.48547: variable 'omit' from source: magic vars 19665 1727204150.48624: _execute() done 19665 1727204150.48627: dumping result to json 19665 1727204150.48630: done dumping result, returning 19665 1727204150.48635: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [0affcd87-79f5-0dcc-3ea6-000000000006] 19665 1727204150.48643: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000006 19665 1727204150.48735: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000006 19665 1727204150.48738: WORKER PROCESS EXITING 19665 1727204150.48807: no more pending results, returning what we have 19665 1727204150.48811: in VariableManager get_vars() 19665 1727204150.48837: Calling all_inventory to load vars for managed-node3 19665 1727204150.48839: Calling groups_inventory to load vars for managed-node3 19665 1727204150.48842: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204150.48858: Calling all_plugins_play to load vars for managed-node3 19665 1727204150.48860: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204150.48865: Calling groups_plugins_play to load vars for managed-node3 19665 1727204150.48976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.49104: done with get_vars() 19665 1727204150.49110: variable 'ansible_search_path' from source: unknown 19665 
1727204150.49120: we have included files to process 19665 1727204150.49121: generating all_blocks data 19665 1727204150.49121: done generating all_blocks data 19665 1727204150.49122: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19665 1727204150.49123: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19665 1727204150.49124: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 19665 1727204150.49816: in VariableManager get_vars() 19665 1727204150.49832: done with get_vars() 19665 1727204150.49845: done processing included file 19665 1727204150.49847: iterating over new_blocks loaded from include file 19665 1727204150.49849: in VariableManager get_vars() 19665 1727204150.49857: done with get_vars() 19665 1727204150.49859: filtering new block on tags 19665 1727204150.49879: done filtering new block on tags 19665 1727204150.49882: in VariableManager get_vars() 19665 1727204150.49893: done with get_vars() 19665 1727204150.49894: filtering new block on tags 19665 1727204150.49909: done filtering new block on tags 19665 1727204150.49912: in VariableManager get_vars() 19665 1727204150.49921: done with get_vars() 19665 1727204150.49923: filtering new block on tags 19665 1727204150.49934: done filtering new block on tags 19665 1727204150.49939: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 19665 1727204150.49944: extending task lists for all hosts with included blocks 19665 1727204150.50001: done extending task lists 19665 1727204150.50003: done processing included files 19665 1727204150.50003: results queue empty 19665 1727204150.50004: checking for any_errors_fatal 19665 1727204150.50005: done checking for any_errors_fatal 19665 1727204150.50006: checking for max_fail_percentage 19665 1727204150.50007: done checking for max_fail_percentage 19665 1727204150.50008: checking to see if all hosts have failed and the running result is not ok 19665 1727204150.50008: done checking to see if all hosts have failed 19665 1727204150.50009: getting the remaining hosts for this loop 19665 1727204150.50010: done getting the remaining hosts for this loop 19665 1727204150.50012: getting the next task for host managed-node3 19665 1727204150.50018: done getting next task for host managed-node3 19665 1727204150.50020: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 19665 1727204150.50022: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204150.50024: getting variables 19665 1727204150.50025: in VariableManager get_vars() 19665 1727204150.50033: Calling all_inventory to load vars for managed-node3 19665 1727204150.50035: Calling groups_inventory to load vars for managed-node3 19665 1727204150.50039: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204150.50044: Calling all_plugins_play to load vars for managed-node3 19665 1727204150.50046: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204150.50049: Calling groups_plugins_play to load vars for managed-node3 19665 1727204150.50198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204150.50401: done with get_vars() 19665 1727204150.50418: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:55:50 -0400 (0:00:00.025) 0:00:01.371 ***** 19665 1727204150.50471: entering _queue_task() for managed-node3/setup 19665 1727204150.50711: worker is 1 (out of 1 available) 19665 1727204150.50723: exiting _queue_task() for managed-node3/setup 19665 1727204150.50735: done queuing things up, now waiting for results queue to drain 19665 1727204150.50739: waiting for pending results... 19665 1727204150.50892: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 19665 1727204150.50969: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000008f 19665 1727204150.50982: variable 'ansible_search_path' from source: unknown 19665 1727204150.50985: variable 'ansible_search_path' from source: unknown 19665 1727204150.51018: calling self._execute() 19665 1727204150.51111: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204150.51115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204150.51122: variable 'omit' from source: magic vars 19665 1727204150.51489: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204150.53088: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204150.53130: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204150.53163: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204150.53187: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204150.53207: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204150.53267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204150.53290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204150.53308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 19665 1727204150.53334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204150.53346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204150.53464: variable 'ansible_facts' from source: unknown 19665 1727204150.53505: variable 'network_test_required_facts' from source: task vars 19665 1727204150.53533: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 19665 1727204150.53540: variable 'omit' from source: magic vars 19665 1727204150.53566: variable 'omit' from source: magic vars 19665 1727204150.53591: variable 'omit' from source: magic vars 19665 1727204150.53611: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204150.53632: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204150.53648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204150.53660: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204150.53670: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204150.53693: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204150.53696: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204150.53699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204150.53762: Set connection var ansible_connection to ssh 19665 1727204150.53770: Set connection var ansible_shell_type to sh 19665 1727204150.53775: Set connection var ansible_timeout to 10 19665 1727204150.53781: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204150.53787: Set connection var ansible_shell_executable to /bin/sh 19665 1727204150.53797: Set connection var ansible_pipelining to False 19665 1727204150.53815: variable 'ansible_shell_executable' from source: unknown 19665 1727204150.53818: variable 'ansible_connection' from source: unknown 19665 1727204150.53822: variable 'ansible_module_compression' from source: unknown 19665 1727204150.53824: variable 'ansible_shell_type' from source: unknown 19665 1727204150.53826: variable 'ansible_shell_executable' from source: unknown 19665 1727204150.53828: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204150.53831: variable 'ansible_pipelining' from source: unknown 19665 1727204150.53833: variable 'ansible_timeout' from source: unknown 19665 1727204150.53835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204150.53925: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204150.53933: variable 'omit' from source: magic vars 19665 1727204150.53940: starting attempt loop 19665 
1727204150.53943: running the handler 19665 1727204150.53951: _low_level_execute_command(): starting 19665 1727204150.53961: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204150.54585: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.54776: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.54780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.54783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.54785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204150.54790: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.54825: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.56619: stdout chunk (state=3): >>>/root <<< 19665 1727204150.56771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204150.56828: stderr chunk (state=3): >>><<< 19665 1727204150.56832: stdout chunk (state=3): >>><<< 19665 1727204150.56852: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204150.56866: _low_level_execute_command(): starting 19665 1727204150.56872: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875 `" && echo ansible-tmp-1727204150.5685303-19823-90294559260875="` echo 
/root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875 `" ) && sleep 0' 19665 1727204150.57355: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.57359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.57396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.57400: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.57402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.57458: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.57461: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.57518: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.59749: stdout chunk (state=3): >>>ansible-tmp-1727204150.5685303-19823-90294559260875=/root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875 <<< 19665 1727204150.59852: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204150.59901: stderr chunk (state=3): >>><<< 19665 1727204150.59904: stdout chunk (state=3): >>><<< 19665 1727204150.59919: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204150.5685303-19823-90294559260875=/root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204150.59966: variable 'ansible_module_compression' from source: unknown 19665 1727204150.60004: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204150.60054: variable 'ansible_facts' from source: unknown 19665 1727204150.60168: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/AnsiballZ_setup.py 19665 1727204150.60281: Sending initial data 19665 1727204150.60290: Sent initial data (153 bytes) 19665 1727204150.60992: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.60996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.61034: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.61041: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.61044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.61096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.61099: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204150.61105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.61154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.63130: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204150.63169: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204150.63207: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpj2h41vdo /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/AnsiballZ_setup.py <<< 19665 1727204150.63254: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204150.65169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204150.65433: stderr chunk (state=3): >>><<< 19665 1727204150.65440: stdout chunk (state=3): >>><<< 19665 1727204150.65442: done transferring module to remote 19665 1727204150.65444: _low_level_execute_command(): starting 19665 1727204150.65446: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/ /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/AnsiballZ_setup.py && sleep 0' 19665 1727204150.66042: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204150.66057: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.66085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.66107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.66155: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204150.66175: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204150.66189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.66205: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204150.66216: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204150.66227: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204150.66240: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.66253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.66271: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.66283: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204150.66296: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204150.66309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.66389: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.66411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204150.66428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.66508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.68946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204150.69030: stderr chunk (state=3): >>><<< 19665 1727204150.69034: stdout chunk (state=3): >>><<< 19665 1727204150.69126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204150.69130: _low_level_execute_command(): starting 19665 1727204150.69132: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/AnsiballZ_setup.py && sleep 0' 19665 1727204150.69686: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204150.69701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.69717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.69737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.69788: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204150.69802: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204150.69818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.69837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204150.69853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204150.69868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204150.69883: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204150.69898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204150.69915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204150.69929: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204150.69941: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204150.69956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204150.70031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204150.70056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204150.70078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204150.70171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204150.72952: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 19665 1727204150.72984: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 19665 1727204150.73073: stdout chunk (state=3): >>>import '_io' # <<< 19665 1727204150.73087: stdout chunk (state=3): >>>import 'marshal' # <<< 19665 1727204150.73144: stdout chunk (state=3): >>>import 'posix' # <<< 19665 1727204150.73189: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 19665 1727204150.73203: stdout chunk (state=3): >>># installing zipimport hook <<< 19665 1727204150.73255: stdout chunk (state=3): >>>import 'time' # <<< 19665 1727204150.73280: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 19665 1727204150.73360: 
stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.73396: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 19665 1727204150.73453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' <<< 19665 1727204150.73456: stdout chunk (state=3): >>>import '_codecs' # <<< 19665 1727204150.73486: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3dc0> <<< 19665 1727204150.73553: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 19665 1727204150.73586: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' <<< 19665 1727204150.73589: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3b20> <<< 19665 1727204150.73632: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' <<< 19665 1727204150.73673: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3ac0> <<< 19665 1727204150.73699: stdout chunk (state=3): >>>import '_signal' # <<< 19665 1727204150.73744: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py <<< 19665 1727204150.73747: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' <<< 19665 1727204150.73778: stdout chunk (state=3): >>>import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58490> <<< 19665 1727204150.73807: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py <<< 19665 1727204150.73818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 19665 1727204150.73851: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py <<< 19665 1727204150.73863: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' <<< 19665 1727204150.73904: stdout chunk (state=3): >>>import '_abc' # <<< 19665 1727204150.73916: stdout chunk (state=3): >>>import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58940> <<< 19665 1727204150.73947: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58670> <<< 19665 1727204150.73983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py <<< 19665 1727204150.74013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19665 1727204150.74043: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches 
/usr/lib64/python3.9/os.py <<< 19665 1727204150.74076: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 19665 1727204150.74111: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19665 1727204150.74140: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19665 1727204150.74178: stdout chunk (state=3): >>>import '_stat' # <<< 19665 1727204150.74191: stdout chunk (state=3): >>>import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f190> <<< 19665 1727204150.74226: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 19665 1727204150.74251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19665 1727204150.74363: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f220> <<< 19665 1727204150.74396: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py <<< 19665 1727204150.74419: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19665 1727204150.74475: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' <<< 19665 1727204150.74489: stdout chunk (state=3): >>>import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f940> <<< 19665 1727204150.74544: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b70880> <<< 19665 1727204150.74580: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py <<< 19665 1727204150.74592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b08d90> <<< 19665 1727204150.74670: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 19665 1727204150.74700: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b32d90> <<< 19665 1727204150.74795: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58970> <<< 19665 1727204150.74838: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 19665 1727204150.75380: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 19665 1727204150.75418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19665 1727204150.75456: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19665 1727204150.75493: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py <<< 19665 1727204150.75527: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19665 1727204150.75569: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py <<< 19665 1727204150.75591: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19665 1727204150.75605: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad2f10> <<< 19665 1727204150.75704: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad90a0> <<< 19665 1727204150.75711: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py <<< 19665 1727204150.75735: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19665 1727204150.75769: stdout chunk (state=3): >>>import '_sre' # <<< 19665 1727204150.75796: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19665 1727204150.75833: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' <<< 19665 1727204150.75871: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 19665 1727204150.75909: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020acc5b0> <<< 19665 1727204150.75933: stdout chunk (state=3): >>>import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad36a0> <<< 19665 1727204150.75959: stdout chunk (state=3): >>>import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad23d0> <<< 19665 1727204150.75993: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19665 1727204150.76087: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19665 1727204150.76140: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 19665 1727204150.76172: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.76201: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py <<< 19665 1727204150.76232: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19665 1727204150.76284: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.76297: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020792eb0> <<< 19665 1727204150.76304: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207929a0> <<< 19665 1727204150.76403: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020792fa0> <<< 19665 1727204150.76439: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py <<< 19665 1727204150.76442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19665 1727204150.76483: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020792df0> <<< 19665 1727204150.76524: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py <<< 19665 1727204150.76541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2160> <<< 19665 1727204150.76575: stdout chunk (state=3): >>>import '_collections' # <<< 19665 1727204150.76637: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aaee20> <<< 19665 1727204150.76657: stdout chunk (state=3): >>>import '_functools' # <<< 19665 1727204150.76697: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aa6700> <<< 19665 1727204150.76782: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' <<< 19665 1727204150.76802: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aba760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020adaeb0> <<< 19665 1727204150.76844: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19665 1727204150.76892: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.76923: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f60207a2d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aae340> <<< 19665 1727204150.76985: stdout chunk (state=3): >>># extension module 'binascii' loaded 
from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.77006: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020aba370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ae0a60> <<< 19665 1727204150.77050: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py <<< 19665 1727204150.77061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' <<< 19665 1727204150.77095: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py <<< 19665 1727204150.77106: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.77152: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 19665 1727204150.77180: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' <<< 19665 1727204150.77192: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2e80> <<< 19665 1727204150.77255: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' <<< 19665 1727204150.77273: stdout chunk (state=3): >>>import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2df0> <<< 19665 1727204150.77294: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py <<< 19665 1727204150.77314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' <<< 19665 1727204150.77361: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 19665 1727204150.77380: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19665 1727204150.77403: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 19665 1727204150.77543: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020776460> <<< 19665 1727204150.77621: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 19665 1727204150.77624: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19665 1727204150.77658: stdout chunk (state=3): >>>import 'contextlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6020776550> <<< 19665 1727204150.77847: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207540d0> <<< 19665 1727204150.77903: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a5b20> <<< 19665 1727204150.77933: stdout chunk (state=3): >>>import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a54c0> <<< 19665 1727204150.77958: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 19665 1727204150.77982: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19665 1727204150.78030: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19665 1727204150.78056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' <<< 19665 1727204150.78085: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 19665 1727204150.78110: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' <<< 19665 1727204150.78125: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206aa2b0> <<< 19665 1727204150.78172: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020761d60> <<< 19665 1727204150.78251: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a5fa0> <<< 19665 1727204150.78269: stdout chunk (state=3): >>>import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ae00d0> <<< 19665 1727204150.78298: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19665 1727204150.78341: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' <<< 19665 1727204150.78372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py <<< 19665 1727204150.78386: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' <<< 19665 1727204150.78394: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206babe0> <<< 19665 1727204150.78425: stdout chunk (state=3): >>>import 'errno' # <<< 19665 1727204150.78474: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.78485: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f60206baf10> <<< 19665 1727204150.78525: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py <<< 19665 1727204150.78540: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' <<< 19665 1727204150.78571: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 19665 1727204150.78577: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' <<< 19665 1727204150.78604: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cd820> <<< 19665 1727204150.78634: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 19665 1727204150.78686: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19665 1727204150.78727: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cdd60> <<< 19665 1727204150.78773: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.78789: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.78795: stdout chunk (state=3): >>>import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020666490> <<< 19665 1727204150.78820: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206baf40> <<< 19665 1727204150.78845: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py <<< 19665 1727204150.78866: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19665 1727204150.78922: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.78938: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020676370> <<< 19665 1727204150.78954: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cd6a0> <<< 19665 1727204150.78979: stdout chunk (state=3): >>>import 'pwd' # <<< 19665 1727204150.79025: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79031: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020676430> <<< 19665 1727204150.79088: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2ac0> <<< 19665 1727204150.79123: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19665 1727204150.79152: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' <<< 19665 1727204150.79181: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py <<< 19665 1727204150.79217: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19665 1727204150.79263: stdout chunk (state=3): >>># 
extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79279: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692790> <<< 19665 1727204150.79310: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py <<< 19665 1727204150.79315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' <<< 19665 1727204150.79375: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79399: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020692850> <<< 19665 1727204150.79429: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79449: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692940> <<< 19665 1727204150.79511: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py <<< 19665 1727204150.79534: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19665 1727204150.79795: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79798: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79802: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692d90> <<< 19665 1727204150.79861: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79865: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.79884: stdout chunk (state=3): >>>import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f602069c2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206929d0> <<< 19665 1727204150.79923: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020686b20> <<< 19665 1727204150.79956: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a26a0> <<< 19665 1727204150.79986: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py <<< 19665 1727204150.80075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' <<< 19665 
1727204150.80125: stdout chunk (state=3): >>>import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020692b80> <<< 19665 1727204150.80332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/cp437.pyc' <<< 19665 1727204150.80367: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f60205ba760> <<< 19665 1727204150.80716: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip' <<< 19665 1727204150.80738: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.80878: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.80923: stdout chunk (state=3): >>>import ansible # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/__init__.py <<< 19665 1727204150.80944: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.80969: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.80990: stdout chunk (state=3): >>>import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/__init__.py <<< 19665 1727204150.81021: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.82968: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.84568: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py <<< 19665 1727204150.84572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' <<< 19665 1727204150.84594: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac8b0> <<< 19665 1727204150.84632: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py <<< 19665 1727204150.84646: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.84678: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py <<< 19665 1727204150.84698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' <<< 19665 1727204150.84735: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py <<< 19665 1727204150.84743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' <<< 19665 1727204150.84785: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.84796: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.84801: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ffac160> <<< 19665 1727204150.84861: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac280> <<< 19665 1727204150.84917: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac5e0> <<< 
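[Editor's aside, not part of the original log: the import trace above shows the wrapped setup module resolving its bundled dependencies from the AnsiballZ payload archive (the "# zipimport: found 103 names in '/tmp/ansible_setup_payload_.../ansible_setup_payload.zip'" and subsequent "loaded from Zip" lines). The mechanism being exercised is Python's standard zipimport support: a zip archive placed on sys.path can be imported from directly, and PYTHONVERBOSE=1 makes the interpreter print each such resolution. A minimal, hedged sketch of that stdlib mechanism follows; the payload name and module names are hypothetical illustrations, not the actual AnsiballZ layout.]

# Sketch of zip-based importing, the stdlib mechanism the trace above exercises.
# File and module names here are hypothetical examples.
import sys
import zipfile

# Build a tiny payload zip containing a package with one module.
with zipfile.ZipFile("payload.zip", "w") as zf:
    zf.writestr("mypayload/__init__.py", "")
    zf.writestr("mypayload/util.py", "def greet():\n    return 'hello from the zip'\n")

# Putting the archive on sys.path lets the zipimport machinery resolve imports
# from inside it; under PYTHONVERBOSE=1 each resolution is reported as a
# "# zipimport: ..." line, as seen in the log.
sys.path.insert(0, "payload.zip")

from mypayload.util import greet
print(greet())

[End of editor's aside; the original log resumes below.]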
19665 1727204150.84943: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py <<< 19665 1727204150.84958: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19665 1727204150.85023: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac4f0> <<< 19665 1727204150.85035: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fface20> <<< 19665 1727204150.85059: stdout chunk (state=3): >>>import 'atexit' # <<< 19665 1727204150.85104: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ffac580> <<< 19665 1727204150.85185: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19665 1727204150.85239: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac100> <<< 19665 1727204150.85242: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 19665 1727204150.85270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' <<< 19665 1727204150.85288: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19665 1727204150.85321: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19665 1727204150.85335: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19665 1727204150.85424: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff41040> <<< 19665 1727204150.85482: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fe893d0> <<< 19665 1727204150.85523: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fe890d0> <<< 19665 1727204150.85530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19665 1727204150.85573: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fe89d30> import 
'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff94d90> <<< 19665 1727204150.85781: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff943a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' <<< 19665 1727204150.85815: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff94f40> <<< 19665 1727204150.85818: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19665 1727204150.85860: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19665 1727204150.85880: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19665 1727204150.85915: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60205baa90> <<< 19665 1727204150.85997: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6adc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6a490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffa9a90> <<< 19665 1727204150.86032: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff6a5b0> <<< 19665 1727204150.86074: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6a5e0> <<< 19665 1727204150.86112: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' <<< 19665 1727204150.86126: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19665 1727204150.86157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19665 1727204150.86227: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fef4f70> <<< 19665 1727204150.86261: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff52e0> <<< 19665 1727204150.86279: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py <<< 19665 1727204150.86282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19665 1727204150.86327: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fef17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff5460> <<< 19665 1727204150.86358: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19665 1727204150.86387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.86421: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' <<< 19665 1727204150.86439: stdout chunk (state=3): >>>import '_string' # <<< 19665 1727204150.86481: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff5c40> <<< 19665 1727204150.86612: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef1790> <<< 19665 1727204150.86703: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5130> <<< 19665 1727204150.86746: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204150.86775: stdout chunk (state=3): >>># extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5670> <<< 19665 1727204150.86803: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffee9a0> <<< 19665 1727204150.86834: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19665 1727204150.86852: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19665 1727204150.86907: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fee78e0> <<< 19665 1727204150.87085: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff05c70> <<< 19665 1727204150.87088: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef0520> <<< 19665 1727204150.87144: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fee7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef0940> <<< 19665 1727204150.87170: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py <<< 19665 1727204150.87184: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.87257: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.87359: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.87395: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py <<< 19665 1727204150.87398: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.87494: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.87586: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.88347: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.88877: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py<<< 19665 1727204150.88887: stdout chunk (state=3): >>> <<< 19665 1727204150.88895: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 19665 1727204150.88912: stdout chunk (state=3): >>> <<< 19665 1727204150.88927: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # 
<<< 19665 1727204150.88939: stdout chunk (state=3): >>> <<< 19665 1727204150.88950: stdout chunk (state=3): >>>import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py<<< 19665 1727204150.88955: stdout chunk (state=3): >>> <<< 19665 1727204150.88994: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py<<< 19665 1727204150.88999: stdout chunk (state=3): >>> <<< 19665 1727204150.89029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc'<<< 19665 1727204150.89034: stdout chunk (state=3): >>> <<< 19665 1727204150.89128: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so'<<< 19665 1727204150.89150: stdout chunk (state=3): >>> # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff00790> <<< 19665 1727204150.89398: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff3f850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac0fa0> <<< 19665 1727204150.89413: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py <<< 19665 1727204150.89441: stdout chunk (state=3): >>># zipimport: zlib available<<< 19665 1727204150.89444: stdout chunk (state=3): >>> <<< 19665 1727204150.89472: stdout chunk (state=3): >>># zipimport: zlib available<<< 19665 1727204150.89477: stdout chunk (state=3): >>> <<< 19665 1727204150.89517: stdout chunk (state=3): >>>import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/_text.py<<< 19665 1727204150.89521: stdout chunk (state=3): >>> <<< 19665 1727204150.89530: stdout chunk (state=3): >>># zipimport: zlib available<<< 19665 1727204150.89546: stdout chunk (state=3): >>> <<< 19665 1727204150.89741: stdout chunk (state=3): >>># zipimport: zlib available<<< 19665 1727204150.89746: stdout chunk (state=3): >>> <<< 19665 1727204150.89978: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py<<< 19665 1727204150.89996: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19665 1727204150.90070: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff72310> <<< 19665 1727204150.90081: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.90684: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91239: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204150.91253: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip 
/tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available <<< 19665 1727204150.91290: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91329: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py <<< 19665 1727204150.91333: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91386: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91470: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available <<< 19665 1727204150.91503: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py <<< 19665 1727204150.91506: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91536: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91575: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available <<< 19665 1727204150.91767: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.91954: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19665 1727204150.91988: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 19665 1727204150.92071: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffb2ca0> # zipimport: zlib available <<< 19665 1727204150.92138: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.92213: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py <<< 19665 1727204150.92216: stdout chunk (state=3): >>>import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py <<< 19665 1727204150.92241: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.92274: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.92311: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 19665 1727204150.92360: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.92389: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.92486: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 19665 1727204150.92546: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19665 1727204150.92576: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204150.92663: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff23c70> <<< 19665 1727204150.93189: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffb2bb0> import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19665 1727204150.93301: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff022b0> <<< 19665 1727204150.93367: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff80b80> <<< 19665 1727204150.93451: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f932eb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py <<< 19665 1727204150.93455: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93488: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93525: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py <<< 19665 1727204150.93659: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/basic.py <<< 19665 1727204150.93662: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/modules/__init__.py <<< 19665 
1727204150.93691: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93766: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93846: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93870: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93889: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93949: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.93999: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.94052: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.94117: stdout chunk (state=3): >>>import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py<<< 19665 1727204150.94610: stdout chunk (state=3): >>> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available <<< 19665 1727204150.94634: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.94879: stdout chunk (state=3): >>># zipimport: zlib available<<< 19665 1727204150.94883: stdout chunk (state=3): >>> <<< 19665 1727204150.94925: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.95000: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py<<< 19665 1727204150.95009: stdout chunk (state=3): >>> <<< 19665 1727204150.95028: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc'<<< 19665 1727204150.95034: stdout chunk (state=3): >>> <<< 19665 1727204150.95062: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py<<< 19665 1727204150.95069: stdout chunk (state=3): >>> <<< 19665 1727204150.95090: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc'<<< 19665 1727204150.95095: stdout chunk (state=3): >>> <<< 19665 1727204150.95125: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py<<< 19665 1727204150.95129: stdout chunk (state=3): >>> <<< 19665 1727204150.95154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc'<<< 19665 1727204150.95156: stdout chunk (state=3): >>> <<< 19665 1727204150.95198: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f81d100><<< 19665 1727204150.95201: stdout chunk (state=3): >>> <<< 19665 1727204150.95233: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py<<< 19665 1727204150.95245: stdout chunk (state=3): >>> <<< 19665 1727204150.95259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc'<<< 19665 1727204150.95260: stdout chunk (state=3): >>> <<< 19665 
1727204150.95286: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py<<< 19665 1727204150.95292: stdout chunk (state=3): >>> <<< 19665 1727204150.95346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc'<<< 19665 1727204150.95348: stdout chunk (state=3): >>> <<< 19665 1727204150.95394: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa7ea60> <<< 19665 1727204150.95450: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fa7e9d0> <<< 19665 1727204150.95527: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa56c70> <<< 19665 1727204150.95530: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa56c10> <<< 19665 1727204150.95565: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5bb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5c40> <<< 19665 1727204150.95594: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py <<< 19665 1727204150.95606: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' <<< 19665 1727204150.95632: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' <<< 19665 1727204150.95690: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fa65310> <<< 19665 1727204150.95721: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa659a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py <<< 19665 1727204150.95731: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' <<< 19665 1727204150.95754: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa65940> <<< 19665 1727204150.95775: stdout chunk (state=3): >>># /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' <<< 19665 1727204150.95811: stdout chunk (state=3): >>># extension module '_multiprocessing' 
loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f87f0d0> <<< 19665 1727204150.95839: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fffec40> <<< 19665 1727204150.95882: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5880> import ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py <<< 19665 1727204150.95924: stdout chunk (state=3): >>>import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py <<< 19665 1727204150.95937: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.95983: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96080: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available <<< 19665 1727204150.96096: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96142: stdout chunk (state=3): >>>import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py <<< 19665 1727204150.96171: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py <<< 19665 1727204150.96194: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96214: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96234: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available <<< 19665 1727204150.96295: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96325: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available <<< 19665 1727204150.96366: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96410: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py <<< 19665 1727204150.96426: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96469: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96533: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19665 1727204150.96566: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.96633: stdout chunk (state=3): >>>import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available <<< 19665 1727204150.97023: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97396: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available <<< 19665 1727204150.97439: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97483: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97520: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97567: stdout chunk (state=3): >>>import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py <<< 19665 1727204150.97579: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97603: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97637: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py <<< 19665 1727204150.97640: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97674: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97734: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available <<< 19665 1727204150.97761: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97787: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py <<< 19665 1727204150.97801: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97834: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.97876: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available <<< 19665 1727204150.97929: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98001: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' <<< 19665 1727204150.98025: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f76ff10> <<< 19665 1727204150.98046: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py <<< 19665 1727204150.98083: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' <<< 19665 1727204150.98230: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f76f9d0> <<< 19665 1727204150.98260: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available <<< 19665 1727204150.98305: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98372: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py <<< 19665 1727204150.98376: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98459: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98543: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py <<< 19665 1727204150.98555: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98598: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98680: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py <<< 19665 1727204150.98683: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98713: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.98747: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py <<< 19665 1727204150.98770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' <<< 19665 1727204150.98916: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f798c10> <<< 19665 1727204150.99173: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f7e1c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available <<< 19665 1727204150.99227: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99272: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py <<< 19665 1727204150.99275: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99341: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99410: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99504: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 
1727204150.99654: stdout chunk (state=3): >>>import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py <<< 19665 1727204150.99657: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99682: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99731: stdout chunk (state=3): >>>import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # zipimport: zlib available <<< 19665 1727204150.99773: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99823: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py <<< 19665 1727204150.99826: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' <<< 19665 1727204150.99898: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f7e35e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f7e3790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available <<< 19665 1727204150.99901: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py <<< 19665 1727204150.99928: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99957: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204150.99995: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available <<< 19665 1727204151.00129: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00270: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py <<< 19665 1727204151.00273: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00361: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00435: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00462: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00512: stdout chunk (state=3): >>>import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip 
/tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py <<< 19665 1727204151.00529: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00593: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00622: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00739: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.00864: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available <<< 19665 1727204151.00967: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.01083: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py <<< 19665 1727204151.01100: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.01116: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.01146: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.01573: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.01999: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available <<< 19665 1727204151.02088: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02177: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available <<< 19665 1727204151.02263: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02358: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.openbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py <<< 19665 1727204151.02361: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02482: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02626: stdout chunk (state=3): >>>import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py <<< 19665 1727204151.02650: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02663: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available <<< 19665 1727204151.02689: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02742: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.base 
# loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py <<< 19665 1727204151.02745: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02823: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.02906: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03077: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03253: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py <<< 19665 1727204151.03256: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03288: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03339: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available <<< 19665 1727204151.03353: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03385: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available <<< 19665 1727204151.03452: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03505: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py <<< 19665 1727204151.03523: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03535: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03569: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available <<< 19665 1727204151.03623: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03685: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py <<< 19665 1727204151.03688: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03729: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.03786: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available <<< 19665 1727204151.04005: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04216: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py <<< 19665 1727204151.04235: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04272: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 19665 1727204151.04329: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available <<< 19665 1727204151.04363: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04395: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py <<< 19665 1727204151.04409: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04433: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04472: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # zipimport: zlib available <<< 19665 1727204151.04505: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04549: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py <<< 19665 1727204151.04552: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04610: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04708: stdout chunk (state=3): >>>import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available <<< 19665 1727204151.04719: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py <<< 19665 1727204151.04731: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04766: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04807: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available <<< 19665 1727204151.04836: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04848: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04897: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04933: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.04990: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.05074: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py <<< 19665 1727204151.05078: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py <<< 19665 1727204151.05091: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 19665 1727204151.05130: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.05171: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available <<< 19665 1727204151.05345: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.05762: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available <<< 19665 1727204151.05843: stdout chunk (state=3): >>>import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py <<< 19665 1727204151.05846: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.05910: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.05992: stdout chunk (state=3): >>>import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py <<< 19665 1727204151.06067: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.06240: stdout chunk (state=3): >>>import 'gc' # <<< 19665 1727204151.06650: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' <<< 19665 1727204151.06685: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' <<< 19665 1727204151.06726: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f583dc0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f555cd0> <<< 19665 1727204151.06788: stdout chunk (state=3): >>>import 'encodings.idna' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f601f5550a0> <<< 19665 1727204151.07778: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "51", "epoch": "1727204151", "epoch_int": "1727204151", "date": "2024-09-24", "time": "14:55:51", "iso8601_micro": "2024-09-24T18:55:51.065193Z", "iso8601": "2024-09-24T18:55:51Z", "iso8601_basic": "20240924T145551065193", "iso8601_basic_short": "20240924T145551", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204151.08416: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout <<< 19665 1727204151.08488: stdout chunk (state=3): >>># restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc 
# cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors <<< 19665 1727204151.08526: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 19665 1727204151.08530: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing 
_compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector <<< 19665 1727204151.08601: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd 
# destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 19665 1727204151.08874: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19665 1727204151.08888: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 19665 1727204151.08923: stdout chunk (state=3): >>># destroy zipimport <<< 19665 1727204151.08967: stdout chunk (state=3): >>># destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma <<< 19665 1727204151.08981: stdout chunk (state=3): >>># destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings <<< 19665 1727204151.09003: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 19665 1727204151.09046: stdout chunk (state=3): >>># destroy selinux # destroy distro # destroy logging # destroy argparse <<< 19665 1727204151.09101: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 19665 1727204151.09122: stdout chunk (state=3): >>># destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 19665 1727204151.09161: stdout chunk (state=3): >>># destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction <<< 19665 1727204151.09183: stdout chunk (state=3): >>># destroy shlex <<< 19665 1727204151.09213: stdout chunk (state=3): >>># destroy datetime # destroy base64 <<< 19665 1727204151.09228: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json <<< 19665 1727204151.09248: stdout chunk (state=3): >>># destroy socket # destroy struct # 
destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 19665 1727204151.09310: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 19665 1727204151.09396: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform <<< 19665 1727204151.09513: stdout chunk (state=3): >>># destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 19665 1727204151.09539: stdout chunk (state=3): >>># destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse <<< 19665 1727204151.09579: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 19665 1727204151.09603: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 19665 1727204151.09625: 
stdout chunk (state=3): >>># destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 19665 1727204151.09771: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 19665 1727204151.09809: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 19665 1727204151.09845: stdout chunk (state=3): >>># destroy stat <<< 19665 1727204151.09871: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 19665 1727204151.09909: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 19665 1727204151.10243: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204151.10363: stderr chunk (state=3): >>><<< 19665 1727204151.10369: stdout chunk (state=3): >>><<< 19665 1727204151.10617: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b583a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020bb3ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' 
import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b32850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b0f940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b70880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b08d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b32d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020b58970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad2f10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad90a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020acc5b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad36a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ad23d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020792eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207929a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020792fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020792df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2160> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aaee20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aa6700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aba760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020adaeb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f60207a2d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020aae340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020aba370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ae0a60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020776460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020776550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207540d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a5b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a54c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206aa2b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020761d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a5fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020ae00d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206babe0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f60206baf10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cd820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cdd60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020666490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206baf40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020676370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206cd6a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020676430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a2ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020692850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6020692d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f602069c2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60206929d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020686b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60207a26a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6020692b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f60205ba760> # zipimport: found 103 names in '/tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac8b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ffac160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac5e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac4f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fface20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ffac580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffac100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code 
object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff41040> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fe893d0> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fe890d0> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fe89d30> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff94d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff943a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff94f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f60205baa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6adc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6a490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffa9a90> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff6a5b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff6a5e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fef4f70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff52e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fef17f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff5460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fff5c40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef1790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5130> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5670> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fff5730> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffee9a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fee78e0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff05c70> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef0520> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fee7e80> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fef0940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff00790> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff3f850> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac0fa0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/_text.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff72310> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffb2ca0> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601ff23c70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ffb2bb0> import 
ansible.module_utils.common.file # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff022b0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601ff80b80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f932eb0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.namespace # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/namespace.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.typing # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/typing.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/__init__.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/context.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/context.cpython-39.pyc' # /usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/process.py # code object from 
'/usr/lib64/python3.9/multiprocessing/__pycache__/process.cpython-39.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f81d100> # /usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/reduction.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/reduction.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc matches /usr/lib64/python3.9/pickle.py # code object from '/usr/lib64/python3.9/__pycache__/pickle.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc matches /usr/lib64/python3.9/_compat_pickle.py # code object from '/usr/lib64/python3.9/__pycache__/_compat_pickle.cpython-39.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa7ea60> # extension module '_pickle' loaded from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.9/lib-dynload/_pickle.cpython-39-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fa7e9d0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa56c70> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa56c10> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5bb0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5c40> # /usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/pool.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/pool.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc matches /usr/lib64/python3.9/queue.py # code object from '/usr/lib64/python3.9/__pycache__/queue.cpython-39.pyc' # extension module '_queue' loaded from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.9/lib-dynload/_queue.cpython-39-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601fa65310> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa659a0> # /usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/util.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/util.cpython-39.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fa65940> # /usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc matches /usr/lib64/python3.9/multiprocessing/connection.py # code object from '/usr/lib64/python3.9/multiprocessing/__pycache__/connection.cpython-39.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.9/lib-dynload/_multiprocessing.cpython-39-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f87f0d0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fffec40> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601fac5880> import 
ansible.module_utils.facts.timeout # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/timeout.py import ansible.module_utils.facts.collector # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/collector.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.facter # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/facter.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.other.ohai # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/other/ohai.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.apparmor # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/apparmor.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.caps # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/caps.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.chroot # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/chroot.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.utils # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/utils.py import ansible.module_utils.facts.system.cmdline # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/cmdline.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.distribution # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/distribution.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.datetime # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/datetime.py import ansible.module_utils.facts.system.date_time # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/date_time.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.env # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/env.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.dns # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/dns.py # zipimport: zlib available # zipimport: zlib available import 
ansible.module_utils.facts.system.fips # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/fips.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.loadavg # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/loadavg.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc matches /usr/lib64/python3.9/glob.py # code object from '/usr/lib64/python3.9/__pycache__/glob.cpython-39.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f76ff10> # /usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc matches /usr/lib64/python3.9/configparser.py # code object from '/usr/lib64/python3.9/__pycache__/configparser.cpython-39.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f76f9d0> import ansible.module_utils.facts.system.local # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/local.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.lsb # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/lsb.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.pkg_mgr # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/pkg_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.platform # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/platform.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc matches /usr/lib64/python3.9/ssl.py # code object from '/usr/lib64/python3.9/__pycache__/ssl.cpython-39.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.9/lib-dynload/_ssl.cpython-39-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f798c10> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f7e1c40> import ansible.module_utils.facts.system.python # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/python.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.selinux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/selinux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat.version # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/compat/version.py import ansible.module_utils.facts.system.service_mgr # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/service_mgr.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.system.ssh_pub_keys # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/ssh_pub_keys.py # 
zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc matches /usr/lib64/python3.9/getpass.py # code object from '/usr/lib64/python3.9/__pycache__/getpass.cpython-39.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.9/lib-dynload/termios.cpython-39-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f7e35e0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f7e3790> import ansible.module_utils.facts.system.user # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/system/user.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.base # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/base.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.aix # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/aix.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.sysctl # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/sysctl.py import ansible.module_utils.facts.hardware.darwin # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/darwin.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/freebsd.py import ansible.module_utils.facts.hardware.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hpux.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/linux.py import ansible.module_utils.facts.hardware.hurd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.openbsd # loaded from Zip 
/tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.hardware.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/hardware/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.base # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.generic_bsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/generic_bsd.py import ansible.module_utils.facts.network.aix # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/aix.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.darwin # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/darwin.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.fc_wwn # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/fc_wwn.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/freebsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.hurd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/hurd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.iscsi # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/iscsi.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.nvme # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/nvme.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/netbsd.py # 
zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.openbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.network.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/network/sunos.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.base # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/base.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sysctl # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sysctl.py import ansible.module_utils.facts.virtual.freebsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/freebsd.py import ansible.module_utils.facts.virtual.dragonfly # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/dragonfly.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.hpux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/hpux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.linux # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/linux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.netbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/netbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.openbsd # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/openbsd.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.virtual.sunos # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/virtual/sunos.py import ansible.module_utils.facts.default_collectors # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/default_collectors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.facts.ansible_collector # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/ansible_collector.py import ansible.module_utils.facts.compat # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/compat.py import ansible.module_utils.facts # loaded from Zip /tmp/ansible_setup_payload_p6byyjnh/ansible_setup_payload.zip/ansible/module_utils/facts/__init__.py # zipimport: zlib available import 'gc' # # 
/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc matches /usr/lib64/python3.9/encodings/idna.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/idna.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc matches /usr/lib64/python3.9/stringprep.py # code object from '/usr/lib64/python3.9/__pycache__/stringprep.cpython-39.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.9/lib-dynload/unicodedata.cpython-39-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f601f583dc0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f555cd0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f601f5550a0> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "51", "epoch": "1727204151", "epoch_int": "1727204151", "date": "2024-09-24", "time": "14:55:51", "iso8601_micro": "2024-09-24T18:55:51.065193Z", "iso8601": "2024-09-24T18:55:51Z", "iso8601_basic": "20240924T145551065193", "iso8601_basic_short": "20240924T145551", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot 
$@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_service_mgr": "systemd", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys 
# cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # 
cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing 
ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing 
ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy 
ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing gc # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy selinux # destroy distro # destroy logging # destroy argparse # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy queue # destroy multiprocessing.process # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy shlex # destroy datetime # destroy base64 # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy json # destroy socket # destroy struct # destroy glob # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping unicodedata # cleanup[3] wiping gc # 
cleanup[3] wiping termios # cleanup[3] wiping _ssl # cleanup[3] wiping configparser # cleanup[3] wiping _multiprocessing # cleanup[3] wiping _queue # cleanup[3] wiping _pickle # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy unicodedata # destroy gc # destroy termios # destroy _ssl # destroy _multiprocessing # destroy _queue # destroy _pickle # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
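The module's stdout above carries the gathered-facts JSON immediately followed by the Python interpreter's verbose shutdown trace (PYTHONVERBOSE=1 is visible in the remote ansible_env), which appears to be why the controller emits the "junk after the JSON data" warning just below. As a minimal, hypothetical sketch of how such output could be split (this is not Ansible's own parsing code, and split_module_output is an invented name), json.JSONDecoder.raw_decode stops at the end of the first JSON document and reports where the trailing noise begins:

import json

def split_module_output(stdout):
    """Split module stdout into (result_dict, trailing_junk).

    Assumes the payload is the first JSON object in the stream and that any
    verbose-interpreter noise comes after it, as in the capture above.
    """
    decoder = json.JSONDecoder()
    start = stdout.index("{")          # skip anything printed before the JSON
    result, end = decoder.raw_decode(stdout, start)
    return result, stdout[end:]

# Hypothetical usage against the capture above:
#   result, junk = split_module_output(raw_stdout)
#   result["ansible_facts"]["ansible_distribution"]   -> "CentOS"
#   result["ansible_facts"]["ansible_service_mgr"]    -> "systemd"
#   junk would hold the "# clear builtins._ ..." shutdown trace.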
[WARNING]: Module invocation had junk after the JSON data: [the junk is a verbatim repeat of the Python interpreter cleanup/shutdown trace already shown above; the duplicate is elided here apart from its final lines] ... # destroy systemd.id128 # destroy
systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 19665 1727204151.12084: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204151.12088: _low_level_execute_command(): starting 19665 1727204151.12091: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204150.5685303-19823-90294559260875/ > /dev/null 2>&1 && sleep 0' 19665 1727204151.13670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.13675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.13805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.13810: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.13813: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.13879: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.14089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.14093: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.14151: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19665 1727204151.15940: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.16022: stderr chunk (state=3): >>><<< 19665 1727204151.16026: stdout chunk (state=3): >>><<< 19665 1727204151.16276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204151.16280: handler run complete 19665 1727204151.16282: variable 'ansible_facts' from source: unknown 19665 1727204151.16285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204151.16287: variable 'ansible_facts' from source: unknown 19665 1727204151.16323: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204151.16393: attempt loop complete, returning result 19665 1727204151.16402: _execute() done 19665 1727204151.16409: dumping result to json 19665 1727204151.16427: done dumping result, returning 19665 1727204151.16440: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [0affcd87-79f5-0dcc-3ea6-00000000008f] 19665 1727204151.16458: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000008f ok: [managed-node3] 19665 1727204151.16810: no more pending results, returning what we have 19665 1727204151.16815: results queue empty 19665 1727204151.16816: checking for any_errors_fatal 19665 1727204151.16818: done checking for any_errors_fatal 19665 1727204151.16819: checking for max_fail_percentage 19665 1727204151.16821: done checking for max_fail_percentage 19665 1727204151.16822: checking to see if all hosts have failed and the running result is not ok 19665 1727204151.16822: done checking to see if all hosts have failed 19665 1727204151.16823: getting the remaining hosts for this loop 19665 1727204151.16825: done getting the remaining hosts for this loop 19665 1727204151.16829: getting the next task for host managed-node3 19665 1727204151.16839: done getting next task for host managed-node3 19665 1727204151.16841: ^ task is: TASK: Check if system is ostree 19665 1727204151.16844: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204151.16847: getting variables 19665 1727204151.16849: in VariableManager get_vars() 19665 1727204151.16875: Calling all_inventory to load vars for managed-node3 19665 1727204151.16878: Calling groups_inventory to load vars for managed-node3 19665 1727204151.16881: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204151.16892: Calling all_plugins_play to load vars for managed-node3 19665 1727204151.16895: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204151.16898: Calling groups_plugins_play to load vars for managed-node3 19665 1727204151.17108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204151.17422: done with get_vars() 19665 1727204151.17433: done getting variables 19665 1727204151.17527: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000008f 19665 1727204151.17530: WORKER PROCESS EXITING TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.673) 0:00:02.044 ***** 19665 1727204151.17790: entering _queue_task() for managed-node3/stat 19665 1727204151.19513: worker is 1 (out of 1 available) 19665 1727204151.19525: exiting _queue_task() for managed-node3/stat 19665 1727204151.19537: done queuing things up, now waiting for results queue to drain 19665 1727204151.19538: waiting for pending results... 
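The trace that follows shows the TaskExecutor running the "Check if system is ostree" step: it evaluates the conditional (not __network_is_ostree is defined), loads the ssh connection and sh shell plugins, sets the per-task connection variables, builds an AnsiballZ payload for the stat module, transfers it into a fresh remote temporary directory over the multiplexed SSH connection, and runs it with /usr/bin/python3.9. For orientation only, a minimal sketch of the kind of task that produces such a trace is given here; the task name, the stat module, and the when-condition are taken from the log itself, while the checked path and the registered variable name are assumptions and are not read from the task file at el_repo_setup.yml:17.

    # Hedged sketch only: the path and register name below are assumed for
    # illustration; the module (stat) and the when-condition come from the trace.
    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted     # assumed marker file, not confirmed by this log
      register: __ostree_stat        # hypothetical variable name
      when: not __network_is_ostree is defined

Once the module returns, the usual pattern (visible earlier in this log for the setup module) is for Ansible to remove the remote temporary directory with rm -f -r over the same multiplexed SSH connection.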
19665 1727204151.20416: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 19665 1727204151.20518: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000091 19665 1727204151.20530: variable 'ansible_search_path' from source: unknown 19665 1727204151.20535: variable 'ansible_search_path' from source: unknown 19665 1727204151.20573: calling self._execute() 19665 1727204151.20652: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204151.20656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204151.20665: variable 'omit' from source: magic vars 19665 1727204151.21883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204151.22299: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204151.22344: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204151.22379: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204151.22410: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204151.22701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204151.22725: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204151.22754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204151.22782: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204151.23211: Evaluated conditional (not __network_is_ostree is defined): True 19665 1727204151.23218: variable 'omit' from source: magic vars 19665 1727204151.23264: variable 'omit' from source: magic vars 19665 1727204151.23305: variable 'omit' from source: magic vars 19665 1727204151.23333: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204151.23362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204151.23516: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204151.23535: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204151.23549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204151.23584: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204151.23587: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204151.23590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204151.23712: Set connection var ansible_connection to ssh 19665 1727204151.23720: Set connection var ansible_shell_type to sh 19665 1727204151.23726: Set connection 
var ansible_timeout to 10 19665 1727204151.23730: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204151.23740: Set connection var ansible_shell_executable to /bin/sh 19665 1727204151.23748: Set connection var ansible_pipelining to False 19665 1727204151.23774: variable 'ansible_shell_executable' from source: unknown 19665 1727204151.23777: variable 'ansible_connection' from source: unknown 19665 1727204151.23780: variable 'ansible_module_compression' from source: unknown 19665 1727204151.23782: variable 'ansible_shell_type' from source: unknown 19665 1727204151.23785: variable 'ansible_shell_executable' from source: unknown 19665 1727204151.23787: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204151.23789: variable 'ansible_pipelining' from source: unknown 19665 1727204151.23793: variable 'ansible_timeout' from source: unknown 19665 1727204151.23797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204151.23949: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204151.23958: variable 'omit' from source: magic vars 19665 1727204151.23965: starting attempt loop 19665 1727204151.23969: running the handler 19665 1727204151.23982: _low_level_execute_command(): starting 19665 1727204151.23989: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204151.25293: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.25305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.25316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.25332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.25381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.25388: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.25398: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.25412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.25420: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.25427: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.25435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.25448: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.25460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.25470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.25476: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.25487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.25591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.25596: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK <<< 19665 1727204151.25603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.25836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204151.27959: stdout chunk (state=3): >>>/root <<< 19665 1727204151.28223: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.28226: stdout chunk (state=3): >>><<< 19665 1727204151.28230: stderr chunk (state=3): >>><<< 19665 1727204151.28234: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204151.28248: _low_level_execute_command(): starting 19665 1727204151.28251: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880 `" && echo ansible-tmp-1727204151.2821827-19849-121249443055880="` echo /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880 `" ) && sleep 0' 19665 1727204151.29877: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.30563: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.30575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.30591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.30714: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.30720: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.30730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.30747: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.30756: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.30762: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.30773: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.30782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.30794: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.30800: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.30807: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.30818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.30898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.30985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.30996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.31382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204151.33201: stdout chunk (state=3): >>>ansible-tmp-1727204151.2821827-19849-121249443055880=/root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880 <<< 19665 1727204151.33671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.33675: stderr chunk (state=3): >>><<< 19665 1727204151.33678: stdout chunk (state=3): >>><<< 19665 1727204151.33680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204151.2821827-19849-121249443055880=/root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204151.33682: variable 'ansible_module_compression' from source: unknown 19665 1727204151.33684: ANSIBALLZ: Using lock for stat 19665 1727204151.33686: ANSIBALLZ: Acquiring lock 19665 1727204151.33688: ANSIBALLZ: Lock acquired: 140619596463376 19665 1727204151.33690: ANSIBALLZ: Creating module 19665 1727204151.53610: ANSIBALLZ: Writing module into payload 19665 1727204151.54062: ANSIBALLZ: Writing module 19665 1727204151.54088: ANSIBALLZ: Renaming module 19665 1727204151.54092: ANSIBALLZ: Done creating module 19665 1727204151.54113: variable 'ansible_facts' from source: unknown 19665 1727204151.54180: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/AnsiballZ_stat.py 19665 1727204151.54824: Sending initial data 19665 1727204151.54827: Sent initial data (153 bytes) 19665 1727204151.59483: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.59591: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config <<< 19665 1727204151.59617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.59638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.59688: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.59781: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.59799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.59821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.59836: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.59849: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.59862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.59879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.59895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.59908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.59919: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.59939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.60020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.60172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.60193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.60273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204151.61989: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204151.62015: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204151.62048: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmptokmesi1 /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/AnsiballZ_stat.py <<< 19665 1727204151.62088: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204151.63474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.63651: stderr chunk (state=3): >>><<< 19665 1727204151.63654: stdout chunk (state=3): >>><<< 19665 1727204151.63657: done transferring module to remote 19665 1727204151.63659: _low_level_execute_command(): starting 19665 1727204151.63661: _low_level_execute_command(): executing: /bin/sh -c 
'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/ /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/AnsiballZ_stat.py && sleep 0' 19665 1727204151.64796: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.64862: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.64879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.64896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.64939: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.64980: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.64994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.65010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.65021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.65030: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.65073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.65090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.65105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.65115: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.65125: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.65137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.65270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.65384: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.65407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.65529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204151.67993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.68071: stderr chunk (state=3): >>><<< 19665 1727204151.68074: stdout chunk (state=3): >>><<< 19665 1727204151.68168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204151.68171: _low_level_execute_command(): starting 19665 1727204151.68174: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/AnsiballZ_stat.py && sleep 0' 19665 1727204151.69619: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.69634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.69648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.69667: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.69799: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.69821: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.69835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.69853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.69865: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.69875: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.69886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.69897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.69919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.69930: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.69940: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.69953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.70037: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.70158: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.70174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.70270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204151.73054: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin <<< 19665 1727204151.73069: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 19665 1727204151.73114: stdout chunk (state=3): >>>import '_io' # <<< 19665 1727204151.73117: stdout chunk (state=3): >>>import 'marshal' # <<< 19665 1727204151.73136: stdout chunk (state=3): >>>import 'posix' # <<< 19665 1727204151.73168: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 19665 1727204151.73228: stdout chunk (state=3): >>>import 'time' # <<< 19665 1727204151.73231: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 19665 1727204151.73262: stdout chunk (state=3): >>># 
/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.73294: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py <<< 19665 1727204151.73307: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # <<< 19665 1727204151.73345: stdout chunk (state=3): >>>import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173dc0> <<< 19665 1727204151.73386: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py <<< 19665 1727204151.73403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3181183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173b20> <<< 19665 1727204151.73422: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173ac0> <<< 19665 1727204151.73490: stdout chunk (state=3): >>>import '_signal' # <<< 19665 1727204151.73494: stdout chunk (state=3): >>># /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118490> <<< 19665 1727204151.73530: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' <<< 19665 1727204151.73544: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118940> <<< 19665 1727204151.73582: stdout chunk (state=3): >>>import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118670> <<< 19665 1727204151.73603: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' <<< 19665 1727204151.73631: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py <<< 19665 1727204151.73649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' <<< 19665 1727204151.73702: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py <<< 19665 1727204151.73718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' <<< 19665 1727204151.73732: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf190> # 
/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py <<< 19665 1727204151.73745: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' <<< 19665 1727204151.73823: stdout chunk (state=3): >>>import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf220> <<< 19665 1727204151.73859: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' <<< 19665 1727204151.73875: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf940> <<< 19665 1727204151.73898: stdout chunk (state=3): >>>import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318130880> <<< 19665 1727204151.73928: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180c8d90> <<< 19665 1727204151.73990: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' <<< 19665 1727204151.73993: stdout chunk (state=3): >>>import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180f2d90> <<< 19665 1727204151.74048: stdout chunk (state=3): >>>import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118970> <<< 19665 1727204151.74072: stdout chunk (state=3): >>>Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 19665 1727204151.74272: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py <<< 19665 1727204151.74315: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' <<< 19665 1727204151.74319: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' <<< 19665 1727204151.74354: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' <<< 19665 1727204151.74392: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' <<< 19665 1727204151.74395: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806df10> <<< 19665 1727204151.74441: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180740a0> <<< 19665 1727204151.74477: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' <<< 19665 1727204151.74508: stdout chunk (state=3): >>>import '_sre' # <<< 19665 1727204151.74537: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py <<< 19665 1727204151.74561: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' <<< 19665 1727204151.74594: stdout chunk (state=3): >>>import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py <<< 19665 1727204151.74654: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' <<< 19665 1727204151.74681: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py <<< 19665 1727204151.74720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.74733: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' <<< 19665 1727204151.74790: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317dd6eb0> import 'heapq' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd69a0> <<< 19665 1727204151.74795: stdout chunk (state=3): >>>import 'itertools' # <<< 19665 1727204151.74830: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd6fa0> <<< 19665 1727204151.74882: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' <<< 19665 1727204151.74886: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd6df0> <<< 19665 1727204151.74903: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6160> <<< 19665 1727204151.74919: stdout chunk (state=3): >>>import '_collections' # <<< 19665 1727204151.74972: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318049e20> import '_functools' # <<< 19665 1727204151.74985: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318041700> <<< 19665 1727204151.75047: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318055760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318075eb0> <<< 19665 1727204151.75083: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' <<< 19665 1727204151.75096: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317de6d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318049340> <<< 19665 1727204151.75144: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc318055370> <<< 19665 1727204151.75182: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31807ba60> <<< 19665 1727204151.75217: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches 
/usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py <<< 19665 1727204151.75238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6e80> <<< 19665 1727204151.75294: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6df0> <<< 19665 1727204151.75322: stdout chunk (state=3): >>># /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py <<< 19665 1727204151.75332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' <<< 19665 1727204151.75345: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py <<< 19665 1727204151.75409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' <<< 19665 1727204151.75437: stdout chunk (state=3): >>># /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dba460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches /usr/lib64/python3.9/contextlib.py <<< 19665 1727204151.75450: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' <<< 19665 1727204151.75478: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dba550> <<< 19665 1727204151.75607: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317d980d0> <<< 19665 1727204151.75638: stdout chunk (state=3): >>>import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de9b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de94c0> <<< 19665 1727204151.75667: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py <<< 19665 1727204151.75677: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' <<< 19665 1727204151.75694: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py <<< 19665 1727204151.75743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py <<< 19665 1727204151.75755: stdout chunk (state=3): 
>>># code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cd42b0> <<< 19665 1727204151.75782: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317da5d60> <<< 19665 1727204151.75843: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de9fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31807b0d0> <<< 19665 1727204151.75869: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py <<< 19665 1727204151.75897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ce4be0> <<< 19665 1727204151.75921: stdout chunk (state=3): >>>import 'errno' # <<< 19665 1727204151.75971: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204151.76002: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317ce4f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py <<< 19665 1727204151.76023: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf7820> <<< 19665 1727204151.76036: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py <<< 19665 1727204151.76061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' <<< 19665 1727204151.76091: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf7d60> <<< 19665 1727204151.76135: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204151.76167: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317c85490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ce4f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' <<< 19665 1727204151.76207: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7fc317c95370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf76a0> <<< 19665 1727204151.76227: stdout chunk (state=3): >>>import 'pwd' # <<< 19665 1727204151.76248: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317c95430> <<< 19665 1727204151.76291: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6ac0> <<< 19665 1727204151.76312: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py <<< 19665 1727204151.76339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' <<< 19665 1727204151.76393: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1790> <<< 19665 1727204151.76424: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cb1850> <<< 19665 1727204151.76448: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1940> <<< 19665 1727204151.76482: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' <<< 19665 1727204151.77297: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cbb2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc317cb19d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ca5b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de66a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cb1b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc317bda760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available <<< 19665 1727204151.78460: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.79428: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d68b0> <<< 19665 1727204151.79460: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.79496: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204151.79501: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175d6160> <<< 19665 1727204151.79524: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6280> <<< 19665 1727204151.79575: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d65e0> <<< 19665 1727204151.79588: stdout chunk (state=3): >>># /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' <<< 19665 1727204151.79628: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6e20> import 'atexit' # <<< 19665 1727204151.79656: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # 
extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175d6580> <<< 19665 1727204151.79690: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py <<< 19665 1727204151.79703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' <<< 19665 1727204151.79759: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6100> <<< 19665 1727204151.79762: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py <<< 19665 1727204151.79782: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py <<< 19665 1727204151.79808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' <<< 19665 1727204151.79833: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' <<< 19665 1727204151.79893: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31752dfd0> <<< 19665 1727204151.79933: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31754bc40> <<< 19665 1727204151.79980: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31754bf40> <<< 19665 1727204151.79983: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py <<< 19665 1727204151.80002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' <<< 19665 1727204151.80051: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31754b2e0> <<< 19665 1727204151.80055: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b69d90> <<< 19665 1727204151.80232: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b693a0> <<< 19665 1727204151.80271: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py <<< 19665 1727204151.80276: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b69f40> <<< 19665 1727204151.80319: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches 
/usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' <<< 19665 1727204151.80328: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' <<< 19665 1727204151.80343: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' <<< 19665 1727204151.80372: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317bdaa90> <<< 19665 1727204151.80450: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a9dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a9490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175e0580> <<< 19665 1727204151.80487: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175a95b0> <<< 19665 1727204151.80537: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a95e0> <<< 19665 1727204151.80551: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py <<< 19665 1727204151.80592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py <<< 19665 1727204151.80595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' <<< 19665 1727204151.80675: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' <<< 19665 1727204151.80678: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751ef70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b4a2e0> <<< 19665 1727204151.80689: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' <<< 19665 1727204151.80744: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751b7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b4a460> <<< 19665 1727204151.80761: stdout chunk (state=3): >>># /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py <<< 19665 1727204151.80796: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.80829: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # <<< 19665 1727204151.80885: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b61f40> <<< 19665 1727204151.81006: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31751b790> <<< 19665 1727204151.81091: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751b5e0> <<< 19665 1727204151.81123: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751a550> <<< 19665 1727204151.81172: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751a490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b409a0> <<< 19665 1727204151.81198: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' <<< 19665 1727204151.81227: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py <<< 19665 1727204151.81230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' <<< 19665 1727204151.81273: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759f6a0> <<< 19665 1727204151.81454: stdout chunk (state=3): >>># extension module 'array' loaded from 
'/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759ebb0> <<< 19665 1727204151.81457: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175af0d0> <<< 19665 1727204151.81512: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175e2c40> <<< 19665 1727204151.81538: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.81541: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available <<< 19665 1727204151.81601: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.81690: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204151.81709: stdout chunk (state=3): >>>import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py <<< 19665 1727204151.81743: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py <<< 19665 1727204151.81746: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.81826: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.81926: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.82377: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.82855: stdout chunk (state=3): >>>import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py <<< 19665 1727204151.82872: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.82927: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3174e7940> <<< 19665 1727204151.83005: stdout chunk (state=3): >>># /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches 
/usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31759cd30> <<< 19665 1727204151.83008: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175937c0> <<< 19665 1727204151.83072: stdout chunk (state=3): >>>import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available <<< 19665 1727204151.83101: stdout chunk (state=3): >>># zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/_text.py <<< 19665 1727204151.83104: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.83212: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.83338: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' <<< 19665 1727204151.83366: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31759e4c0> # zipimport: zlib available <<< 19665 1727204151.83759: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84116: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84172: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84240: stdout chunk (state=3): >>>import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/collections.py <<< 19665 1727204151.84243: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84269: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84303: stdout chunk (state=3): >>>import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available <<< 19665 1727204151.84363: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84454: stdout chunk (state=3): >>>import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/errors.py <<< 19665 1727204151.84472: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 19665 1727204151.84476: stdout chunk (state=3): >>>import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available <<< 19665 1727204151.84503: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84546: stdout chunk (state=3): >>>import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py <<< 19665 1727204151.84549: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84721: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.84914: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py <<< 19665 1727204151.84951: 
stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # <<< 19665 1727204151.85026: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317082940> <<< 19665 1727204151.85031: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85076: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85182: stdout chunk (state=3): >>>import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/validation.py <<< 19665 1727204151.85186: stdout chunk (state=3): >>>import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available <<< 19665 1727204151.85209: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85247: stdout chunk (state=3): >>>import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available <<< 19665 1727204151.85292: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85322: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85430: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85488: stdout chunk (state=3): >>># /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py <<< 19665 1727204151.85518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' <<< 19665 1727204151.85585: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317b54b50> <<< 19665 1727204151.85613: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31708afa0> <<< 19665 1727204151.85652: stdout chunk (state=3): >>>import ansible.module_utils.common.file # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available <<< 19665 1727204151.85770: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85839: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85855: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.85889: stdout chunk (state=3): >>># /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # 
code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' <<< 19665 1727204151.85917: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py <<< 19665 1727204151.85966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' <<< 19665 1727204151.85981: stdout chunk (state=3): >>># /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' <<< 19665 1727204151.86064: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3170d26d0> <<< 19665 1727204151.86099: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3174ddc10> <<< 19665 1727204151.86161: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3174dc5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available <<< 19665 1727204151.86205: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.86219: stdout chunk (state=3): >>>import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py <<< 19665 1727204151.86315: stdout chunk (state=3): >>>import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available <<< 19665 1727204151.86320: stdout chunk (state=3): >>># zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/modules/__init__.py <<< 19665 1727204151.86334: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.86431: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.86594: stdout chunk (state=3): >>># zipimport: zlib available <<< 19665 1727204151.86732: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 19665 1727204151.87055: stdout chunk (state=3): >>># clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ <<< 19665 1727204151.87171: stdout chunk (state=3): >>># restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # 
cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale <<< 19665 1727204151.87208: stdout chunk (state=3): >>># cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy 
token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy 
ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 19665 1727204151.87425: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 19665 1727204151.87479: stdout chunk (state=3): >>># destroy importlib.util # destroy importlib.abc # destroy importlib.machinery <<< 19665 1727204151.87514: stdout chunk (state=3): >>># destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse <<< 19665 1727204151.87620: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 19665 1727204151.87651: stdout chunk (state=3): >>># destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno <<< 19665 1727204151.87708: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # 
cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal <<< 19665 1727204151.87859: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize <<< 19665 1727204151.87906: stdout chunk (state=3): >>># destroy _heapq # destroy posixpath <<< 19665 1727204151.87920: stdout chunk (state=3): >>># destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal <<< 19665 1727204151.87932: stdout chunk (state=3): >>># destroy _frozen_importlib # clear sys.audit hooks <<< 19665 1727204151.88284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204151.88325: stderr chunk (state=3): >>><<< 19665 1727204151.88328: stdout chunk (state=3): >>><<< 19665 1727204151.88482: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/encodings/__init__.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc matches /usr/lib64/python3.9/codecs.py # code object from '/usr/lib64/python3.9/__pycache__/codecs.cpython-39.pyc' import '_codecs' # import 'codecs' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173dc0> # /usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc matches /usr/lib64/python3.9/encodings/aliases.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/aliases.cpython-39.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3181183a0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173b20> # /usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc matches /usr/lib64/python3.9/encodings/utf_8.py # code object from '/usr/lib64/python3.9/encodings/__pycache__/utf_8.cpython-39.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318173ac0> import '_signal' # # /usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc matches /usr/lib64/python3.9/encodings/latin_1.py # code object from 
'/usr/lib64/python3.9/encodings/__pycache__/latin_1.cpython-39.pyc' import 'encodings.latin_1' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118490> # /usr/lib64/python3.9/__pycache__/io.cpython-39.pyc matches /usr/lib64/python3.9/io.py # code object from '/usr/lib64/python3.9/__pycache__/io.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/abc.py # code object from '/usr/lib64/python3.9/__pycache__/abc.cpython-39.pyc' import '_abc' # import 'abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118940> import 'io' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118670> # /usr/lib64/python3.9/__pycache__/site.cpython-39.pyc matches /usr/lib64/python3.9/site.py # code object from '/usr/lib64/python3.9/__pycache__/site.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/os.cpython-39.pyc matches /usr/lib64/python3.9/os.py # code object from '/usr/lib64/python3.9/__pycache__/os.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc matches /usr/lib64/python3.9/stat.py # code object from '/usr/lib64/python3.9/__pycache__/stat.cpython-39.pyc' import '_stat' # import 'stat' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf190> # /usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc matches /usr/lib64/python3.9/_collections_abc.py # code object from '/usr/lib64/python3.9/__pycache__/_collections_abc.cpython-39.pyc' import '_collections_abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf220> # /usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc matches /usr/lib64/python3.9/posixpath.py # code object from '/usr/lib64/python3.9/__pycache__/posixpath.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc matches /usr/lib64/python3.9/genericpath.py # code object from '/usr/lib64/python3.9/__pycache__/genericpath.cpython-39.pyc' import 'genericpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180f2850> import 'posixpath' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180cf940> import 'os' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318130880> # /usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc matches /usr/lib64/python3.9/_sitebuiltins.py # code object from '/usr/lib64/python3.9/__pycache__/_sitebuiltins.cpython-39.pyc' import '_sitebuiltins' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180c8d90> # /usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc matches /usr/lib64/python3.9/_bootlocale.py # code object from '/usr/lib64/python3.9/__pycache__/_bootlocale.cpython-39.pyc' import '_locale' # import '_bootlocale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180f2d90> import 'site' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318118970> Python 3.9.19 (main, Aug 23 2024, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-2)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc matches /usr/lib64/python3.9/base64.py # code object from '/usr/lib64/python3.9/__pycache__/base64.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/re.cpython-39.pyc matches /usr/lib64/python3.9/re.py # code object from '/usr/lib64/python3.9/__pycache__/re.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc matches /usr/lib64/python3.9/enum.py # code object from '/usr/lib64/python3.9/__pycache__/enum.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/types.cpython-39.pyc matches /usr/lib64/python3.9/types.py # code object from '/usr/lib64/python3.9/__pycache__/types.cpython-39.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806df10> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180740a0> # /usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc matches /usr/lib64/python3.9/sre_compile.py # code object from '/usr/lib64/python3.9/__pycache__/sre_compile.cpython-39.pyc' import '_sre' # # /usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc matches /usr/lib64/python3.9/sre_parse.py # code object from '/usr/lib64/python3.9/__pycache__/sre_parse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc matches /usr/lib64/python3.9/sre_constants.py # code object from '/usr/lib64/python3.9/__pycache__/sre_constants.cpython-39.pyc' import 'sre_constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3180675b0> import 'sre_parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806e6a0> import 'sre_compile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31806d3d0> # /usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc matches /usr/lib64/python3.9/functools.py # code object from '/usr/lib64/python3.9/__pycache__/functools.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/collections/__init__.py # code object from '/usr/lib64/python3.9/collections/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc matches /usr/lib64/python3.9/heapq.py # code object from '/usr/lib64/python3.9/__pycache__/heapq.cpython-39.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.9/lib-dynload/_heapq.cpython-39-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317dd6eb0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd69a0> import 'itertools' # # /usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc matches /usr/lib64/python3.9/keyword.py # code object from '/usr/lib64/python3.9/__pycache__/keyword.cpython-39.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd6fa0> # /usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc matches /usr/lib64/python3.9/operator.py # code object from '/usr/lib64/python3.9/__pycache__/operator.cpython-39.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dd6df0> # /usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc matches /usr/lib64/python3.9/reprlib.py # code object from '/usr/lib64/python3.9/__pycache__/reprlib.cpython-39.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6160> import '_collections' # import 
'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318049e20> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318041700> # /usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc matches /usr/lib64/python3.9/copyreg.py # code object from '/usr/lib64/python3.9/__pycache__/copyreg.cpython-39.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318055760> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318075eb0> # /usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc matches /usr/lib64/python3.9/struct.py # code object from '/usr/lib64/python3.9/__pycache__/struct.cpython-39.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.9/lib-dynload/_struct.cpython-39-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317de6d60> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc318049340> # extension module 'binascii' loaded from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.9/lib-dynload/binascii.cpython-39-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc318055370> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31807ba60> # /usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc matches /usr/lib64/python3.9/runpy.py # code object from '/usr/lib64/python3.9/__pycache__/runpy.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/importlib/__init__.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc matches /usr/lib64/python3.9/warnings.py # code object from '/usr/lib64/python3.9/__pycache__/warnings.cpython-39.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6f40> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6e80> # /usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc matches /usr/lib64/python3.9/importlib/machinery.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/machinery.cpython-39.pyc' import 'importlib.machinery' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6df0> # /usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc matches /usr/lib64/python3.9/importlib/util.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/util.cpython-39.pyc' # /usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/importlib/abc.py # code object from '/usr/lib64/python3.9/importlib/__pycache__/abc.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc matches /usr/lib64/python3.9/typing.py # code object from '/usr/lib64/python3.9/__pycache__/typing.cpython-39.pyc' # /usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc matches /usr/lib64/python3.9/collections/abc.py # code object from '/usr/lib64/python3.9/collections/__pycache__/abc.cpython-39.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dba460> # /usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc matches 
/usr/lib64/python3.9/contextlib.py # code object from '/usr/lib64/python3.9/__pycache__/contextlib.cpython-39.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317dba550> import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317d980d0> import 'importlib.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de9b20> import 'importlib.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de94c0> # /usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc matches /usr/lib64/python3.9/pkgutil.py # code object from '/usr/lib64/python3.9/__pycache__/pkgutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc matches /usr/lib64/python3.9/weakref.py # code object from '/usr/lib64/python3.9/__pycache__/weakref.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc matches /usr/lib64/python3.9/_weakrefset.py # code object from '/usr/lib64/python3.9/__pycache__/_weakrefset.cpython-39.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cd42b0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317da5d60> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de9fa0> import 'runpy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31807b0d0> # /usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc matches /usr/lib64/python3.9/shutil.py # code object from '/usr/lib64/python3.9/__pycache__/shutil.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc matches /usr/lib64/python3.9/fnmatch.py # code object from '/usr/lib64/python3.9/__pycache__/fnmatch.cpython-39.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ce4be0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.9/lib-dynload/zlib.cpython-39-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317ce4f10> # /usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc matches /usr/lib64/python3.9/bz2.py # code object from '/usr/lib64/python3.9/__pycache__/bz2.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc matches /usr/lib64/python3.9/_compression.py # code object from '/usr/lib64/python3.9/__pycache__/_compression.cpython-39.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf7820> # /usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc matches /usr/lib64/python3.9/threading.py # code object from '/usr/lib64/python3.9/__pycache__/threading.cpython-39.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf7d60> # extension module '_bz2' loaded from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.9/lib-dynload/_bz2.cpython-39-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317c85490> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ce4f40> # /usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc matches /usr/lib64/python3.9/lzma.py # code object from '/usr/lib64/python3.9/__pycache__/lzma.cpython-39.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' # 
extension module '_lzma' executed from '/usr/lib64/python3.9/lib-dynload/_lzma.cpython-39-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317c95370> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cf76a0> import 'pwd' # # extension module 'grp' loaded from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.9/lib-dynload/grp.cpython-39-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317c95430> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de6ac0> # /usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc matches /usr/lib64/python3.9/tempfile.py # code object from '/usr/lib64/python3.9/__pycache__/tempfile.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/random.cpython-39.pyc matches /usr/lib64/python3.9/random.py # code object from '/usr/lib64/python3.9/__pycache__/random.cpython-39.pyc' # extension module 'math' loaded from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.9/lib-dynload/math.cpython-39-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1790> # /usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc matches /usr/lib64/python3.9/bisect.py # code object from '/usr/lib64/python3.9/__pycache__/bisect.cpython-39.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.9/lib-dynload/_bisect.cpython-39-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1a60> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cb1850> # extension module '_random' loaded from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.9/lib-dynload/_random.cpython-39-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1940> # /usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc matches /usr/lib64/python3.9/hashlib.py # code object from '/usr/lib64/python3.9/__pycache__/hashlib.cpython-39.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.9/lib-dynload/_hashlib.cpython-39-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cb1d90> # extension module '_blake2' loaded from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.9/lib-dynload/_blake2.cpython-39-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317cbb2e0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cb19d0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317ca5b20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317de66a0> # /usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc matches /usr/lib64/python3.9/zipfile.py # code object from '/usr/lib64/python3.9/__pycache__/zipfile.cpython-39.pyc' import 
'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317cb1b80> # code object from '/usr/lib64/python3.9/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc317bda760> # zipimport: found 30 names in '/tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available import ansible # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/__init__.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc matches /usr/lib64/python3.9/__future__.py # code object from '/usr/lib64/python3.9/__pycache__/__future__.cpython-39.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d68b0> # /usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/json/__init__.py # code object from '/usr/lib64/python3.9/json/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc matches /usr/lib64/python3.9/json/decoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/decoder.cpython-39.pyc' # /usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc matches /usr/lib64/python3.9/json/scanner.py # code object from '/usr/lib64/python3.9/json/__pycache__/scanner.cpython-39.pyc' # extension module '_json' loaded from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.9/lib-dynload/_json.cpython-39-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175d6160> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6280> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d65e0> # /usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc matches /usr/lib64/python3.9/json/encoder.py # code object from '/usr/lib64/python3.9/json/__pycache__/encoder.cpython-39.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d64f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6e20> import 'atexit' # # extension module 'fcntl' loaded from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.9/lib-dynload/fcntl.cpython-39-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175d6580> # /usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc matches /usr/lib64/python3.9/locale.py # code object from '/usr/lib64/python3.9/__pycache__/locale.cpython-39.pyc' import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175d6100> # /usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc matches /usr/lib64/python3.9/platform.py # code object from '/usr/lib64/python3.9/__pycache__/platform.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc matches /usr/lib64/python3.9/subprocess.py # code object from '/usr/lib64/python3.9/__pycache__/subprocess.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc matches /usr/lib64/python3.9/signal.py # code object 
from '/usr/lib64/python3.9/__pycache__/signal.cpython-39.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31752dfd0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.9/lib-dynload/_posixsubprocess.cpython-39-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31754bc40> # extension module 'select' loaded from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.9/lib-dynload/select.cpython-39-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31754bf40> # /usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc matches /usr/lib64/python3.9/selectors.py # code object from '/usr/lib64/python3.9/__pycache__/selectors.cpython-39.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31754b2e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b69d90> import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b693a0> # /usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc matches /usr/lib64/python3.9/shlex.py # code object from '/usr/lib64/python3.9/__pycache__/shlex.cpython-39.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b69f40> # /usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc matches /usr/lib64/python3.9/traceback.py # code object from '/usr/lib64/python3.9/__pycache__/traceback.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc matches /usr/lib64/python3.9/linecache.py # code object from '/usr/lib64/python3.9/__pycache__/linecache.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc matches /usr/lib64/python3.9/tokenize.py # code object from '/usr/lib64/python3.9/__pycache__/tokenize.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/token.cpython-39.pyc matches /usr/lib64/python3.9/token.py # code object from '/usr/lib64/python3.9/__pycache__/token.cpython-39.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317bdaa90> import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a9dc0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a9490> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175e0580> # extension module 'syslog' loaded from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.9/lib-dynload/syslog.cpython-39-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3175a95b0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/__init__.cpython-39.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175a95e0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/journal.cpython-39.pyc' # 
/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc matches /usr/lib64/python3.9/datetime.py # code object from '/usr/lib64/python3.9/__pycache__/datetime.cpython-39.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.9/lib-dynload/_datetime.cpython-39-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751ef70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b4a2e0> # /usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc matches /usr/lib64/python3.9/uuid.py # code object from '/usr/lib64/python3.9/__pycache__/uuid.cpython-39.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.9/lib-dynload/_uuid.cpython-39-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751b7f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b4a460> # /usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/logging/__init__.py # code object from '/usr/lib64/python3.9/logging/__pycache__/__init__.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/string.cpython-39.pyc matches /usr/lib64/python3.9/string.py # code object from '/usr/lib64/python3.9/__pycache__/string.cpython-39.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b61f40> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31751b790> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.9/site-packages/systemd/_journal.cpython-39-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751b5e0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.9/site-packages/systemd/_reader.cpython-39-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751a550> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.9/site-packages/systemd/id128.cpython-39-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31751a490> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317b409a0> # /usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.9/site-packages/systemd/__pycache__/daemon.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc matches /usr/lib64/python3.9/socket.py # code object from '/usr/lib64/python3.9/__pycache__/socket.cpython-39.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.9/lib-dynload/_socket.cpython-39-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759f6a0> # extension module 'array' loaded from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.9/lib-dynload/array.cpython-39-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759ebb0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175af0d0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.9/site-packages/systemd/_daemon.cpython-39-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc31759f100> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175e2c40> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.compat # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/compat/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.six # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/six/__init__.py import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import ansible.module_utils.common.text.converters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/converters.py # /usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/__init__.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/__init__.cpython-39.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.9/lib-dynload/_ctypes.cpython-39-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc3174e7940> # /usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc matches /usr/lib64/python3.9/ctypes/_endian.py # code object from '/usr/lib64/python3.9/ctypes/__pycache__/_endian.cpython-39.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31759cd30> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3175937c0> import ansible.module_utils.compat.selinux # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/compat/selinux.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils._text # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/_text.py # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc matches /usr/lib64/python3.9/copy.py # code object from '/usr/lib64/python3.9/__pycache__/copy.cpython-39.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31759e4c0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.collections # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/collections.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.warnings # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/warnings.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.errors # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/errors.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/parsing/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.parsing.convert_bool # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/parsing/convert_bool.py # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc matches /usr/lib64/python3.9/ast.py # code object from '/usr/lib64/python3.9/__pycache__/ast.cpython-39.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc317082940> # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.text.formatters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/text/formatters.py import ansible.module_utils.common.validation # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/validation.py import ansible.module_utils.common.parameters # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/parameters.py import ansible.module_utils.common.arg_spec # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/arg_spec.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common.locale # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/locale.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc matches /usr/lib64/python3.9/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.9/site-packages/selinux/__pycache__/__init__.cpython-39.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.9/site-packages/selinux/_selinux.cpython-39-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc317b54b50> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc31708afa0> import ansible.module_utils.common.file # 
loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/file.py import ansible.module_utils.common.process # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/process.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc matches /usr/lib/python3.9/site-packages/distro.py # code object from '/usr/lib/python3.9/site-packages/__pycache__/distro.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc matches /usr/lib64/python3.9/argparse.py # code object from '/usr/lib64/python3.9/__pycache__/argparse.cpython-39.pyc' # /usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc matches /usr/lib64/python3.9/gettext.py # code object from '/usr/lib64/python3.9/__pycache__/gettext.cpython-39.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3170d26d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3174ddc10> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc3174dc5b0> # destroy ansible.module_utils.distro import ansible.module_utils.distro # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/distro/__init__.py # zipimport: zlib available # zipimport: zlib available import ansible.module_utils.common._utils # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/_utils.py import ansible.module_utils.common.sys_info # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/common/sys_info.py import ansible.module_utils.basic # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/module_utils/basic.py # zipimport: zlib available # zipimport: zlib available import ansible.modules # loaded from Zip /tmp/ansible_stat_payload_en7u44hg/ansible_stat_payload.zip/ansible/modules/__init__.py # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # 
cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy 
grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
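The module execution captured above is the payload side of the "Check if system is ostree" task (its controller-side bookkeeping follows below, and it apparently belongs to the same el_repo_setup.yml setup file referenced a little further on): Ansible ships the zipped stat payload seen in the import trace to the target, runs it under the remote /usr/lib64 Python 3.9, and parses the single JSON line it prints, here {"changed": false, "stat": {"exists": false}} for the path /run/ostree-booted. The extra module_args in the invocation record (follow, get_checksum, get_mime, get_attributes, checksum_algorithm) are just the stat module's defaults. As a rough orientation, the task driving this run presumably looks like the minimal sketch below; the register name is inferred from the __ostree_booted_stat variable that appears later in this log, and the exact YAML is an assumption, not the verbatim role source:

    - name: Check if system is ostree
      ansible.builtin.stat:            # shows up in the log as _execute_module (stat, ...)
        path: /run/ostree-booted       # marker file present on rpm-ostree based systems
      register: __ostree_booted_stat   # assumed register; this variable is referenced later in the log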
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.path_hooks # clear sys.path_importer_cache # clear sys.meta_path # clear sys.__interactivehook__ # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing encodings.latin_1 # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing _locale # cleanup[2] removing _bootlocale # destroy _bootlocale # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing sre_constants # destroy sre_constants # cleanup[2] removing sre_parse # cleanup[2] removing sre_compile # cleanup[2] removing _heapq # cleanup[2] removing heapq # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing collections.abc # cleanup[2] removing contextlib # cleanup[2] removing typing # destroy typing # cleanup[2] removing importlib.abc # cleanup[2] removing importlib.util # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing threading # cleanup[2] removing _bz2 # destroy _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing pwd # cleanup[2] removing grp # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing tempfile # 
cleanup[2] removing zipfile # destroy zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing fcntl # cleanup[2] removing locale # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing subprocess # cleanup[2] removing platform # cleanup[2] removing shlex # cleanup[2] removing token # destroy token # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.util # destroy importlib.abc # destroy importlib.machinery # destroy zipimport # destroy _compression # destroy binascii # destroy importlib # destroy struct # destroy bz2 # destroy lzma # destroy __main__ # destroy locale # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy encodings # destroy syslog # destroy uuid # destroy array # destroy datetime # destroy selinux # destroy distro # destroy json # destroy shlex # destroy logging # destroy argparse # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # cleanup[3] wiping tokenize # cleanup[3] wiping platform # destroy subprocess # cleanup[3] wiping selectors # cleanup[3] wiping select # cleanup[3] wiping _posixsubprocess # cleanup[3] wiping signal # cleanup[3] wiping fcntl # cleanup[3] wiping atexit # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping _blake2 # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping shutil # destroy fnmatch # cleanup[3] wiping grp # cleanup[3] wiping pwd # cleanup[3] wiping _lzma # cleanup[3] wiping threading # cleanup[3] wiping zlib # cleanup[3] wiping errno # cleanup[3] wiping weakref # cleanup[3] wiping contextlib # cleanup[3] wiping collections.abc # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy enum # destroy sre_compile # destroy copyreg # cleanup[3] wiping functools # cleanup[3] wiping _functools # destroy _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy heapq # destroy collections.abc # cleanup[3] wiping _collections # destroy _collections # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping itertools # cleanup[3] wiping _heapq # cleanup[3] wiping sre_parse # cleanup[3] wiping _sre # cleanup[3] wiping types # cleanup[3] wiping _locale # destroy _locale # cleanup[3] wiping os # cleanup[3] wiping os.path # destroy genericpath # 
cleanup[3] wiping posixpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.latin_1 # cleanup[3] wiping _signal # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy systemd._daemon # destroy _socket # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy fcntl # destroy _blake2 # destroy _lzma # destroy zlib # destroy _signal # destroy platform # destroy _uuid # destroy _sre # destroy sre_parse # destroy tokenize # destroy _heapq # destroy posixpath # destroy stat # destroy ansible.module_utils.six.moves.urllib # destroy errno # destroy signal # destroy contextlib # destroy pwd # destroy grp # destroy _posixsubprocess # destroy selectors # destroy select # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy itertools # destroy operator # destroy ansible.module_utils.six.moves # destroy _operator # destroy _frozen_importlib_external # destroy _imp # destroy io # destroy marshal # destroy _frozen_importlib # clear sys.audit hooks 19665 1727204151.89032: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204151.89035: _low_level_execute_command(): starting 19665 1727204151.89037: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204151.2821827-19849-121249443055880/ > /dev/null 2>&1 && sleep 0' 19665 1727204151.91187: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204151.91205: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.91220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.91289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.91338: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.91378: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204151.91395: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.91413: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204151.91427: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204151.91485: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204151.91502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204151.91515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204151.91532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204151.91546: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204151.91556: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204151.91570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204151.91721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204151.91768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204151.91786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204151.91863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204151.93765: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204151.93771: stdout chunk (state=3): >>><<< 19665 1727204151.93774: stderr chunk (state=3): >>><<< 19665 1727204151.94080: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204151.94084: handler run complete 19665 1727204151.94086: attempt loop complete, returning result 19665 1727204151.94088: _execute() done 19665 1727204151.94091: dumping result to json 19665 1727204151.94093: done dumping result, returning 19665 1727204151.94095: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree [0affcd87-79f5-0dcc-3ea6-000000000091] 19665 1727204151.94098: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000091 19665 1727204151.94174: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000091 19665 1727204151.94179: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { 
"exists": false } } 19665 1727204151.94248: no more pending results, returning what we have 19665 1727204151.94252: results queue empty 19665 1727204151.94253: checking for any_errors_fatal 19665 1727204151.94262: done checking for any_errors_fatal 19665 1727204151.94263: checking for max_fail_percentage 19665 1727204151.94266: done checking for max_fail_percentage 19665 1727204151.94267: checking to see if all hosts have failed and the running result is not ok 19665 1727204151.94268: done checking to see if all hosts have failed 19665 1727204151.94269: getting the remaining hosts for this loop 19665 1727204151.94271: done getting the remaining hosts for this loop 19665 1727204151.94276: getting the next task for host managed-node3 19665 1727204151.94284: done getting next task for host managed-node3 19665 1727204151.94287: ^ task is: TASK: Set flag to indicate system is ostree 19665 1727204151.94291: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204151.94294: getting variables 19665 1727204151.94296: in VariableManager get_vars() 19665 1727204151.94326: Calling all_inventory to load vars for managed-node3 19665 1727204151.94329: Calling groups_inventory to load vars for managed-node3 19665 1727204151.94333: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204151.94344: Calling all_plugins_play to load vars for managed-node3 19665 1727204151.94347: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204151.94350: Calling groups_plugins_play to load vars for managed-node3 19665 1727204151.94541: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204151.94999: done with get_vars() 19665 1727204151.95009: done getting variables 19665 1727204151.95109: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:55:51 -0400 (0:00:00.773) 0:00:02.818 ***** 19665 1727204151.95142: entering _queue_task() for managed-node3/set_fact 19665 1727204151.95144: Creating lock for set_fact 19665 1727204151.95836: worker is 1 (out of 1 available) 19665 1727204151.95848: exiting _queue_task() for managed-node3/set_fact 19665 1727204151.95858: done queuing things up, now waiting for results queue to drain 19665 1727204151.95860: waiting for pending results... 
19665 1727204151.96637: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 19665 1727204151.96845: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000092 19665 1727204151.96857: variable 'ansible_search_path' from source: unknown 19665 1727204151.96862: variable 'ansible_search_path' from source: unknown 19665 1727204151.96899: calling self._execute() 19665 1727204151.97157: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204151.97161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204151.97173: variable 'omit' from source: magic vars 19665 1727204151.98163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204151.98729: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204151.98924: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204151.98964: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204151.99105: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204151.99307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204151.99343: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204151.99372: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204151.99477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204151.99724: Evaluated conditional (not __network_is_ostree is defined): True 19665 1727204151.99777: variable 'omit' from source: magic vars 19665 1727204151.99865: variable 'omit' from source: magic vars 19665 1727204152.00982: variable '__ostree_booted_stat' from source: set_fact 19665 1727204152.01035: variable 'omit' from source: magic vars 19665 1727204152.01065: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204152.01096: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204152.01115: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204152.01132: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.01158: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.01189: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204152.01192: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.01195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.01282: Set connection var ansible_connection to ssh 19665 
1727204152.01288: Set connection var ansible_shell_type to sh 19665 1727204152.01293: Set connection var ansible_timeout to 10 19665 1727204152.01299: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204152.01306: Set connection var ansible_shell_executable to /bin/sh 19665 1727204152.01313: Set connection var ansible_pipelining to False 19665 1727204152.01335: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.01339: variable 'ansible_connection' from source: unknown 19665 1727204152.01343: variable 'ansible_module_compression' from source: unknown 19665 1727204152.01345: variable 'ansible_shell_type' from source: unknown 19665 1727204152.01348: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.01350: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.01358: variable 'ansible_pipelining' from source: unknown 19665 1727204152.01361: variable 'ansible_timeout' from source: unknown 19665 1727204152.01365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.01452: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204152.01462: variable 'omit' from source: magic vars 19665 1727204152.01465: starting attempt loop 19665 1727204152.02481: running the handler 19665 1727204152.02491: handler run complete 19665 1727204152.02502: attempt loop complete, returning result 19665 1727204152.02505: _execute() done 19665 1727204152.02508: dumping result to json 19665 1727204152.02510: done dumping result, returning 19665 1727204152.02519: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [0affcd87-79f5-0dcc-3ea6-000000000092] 19665 1727204152.02524: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000092 19665 1727204152.02625: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000092 19665 1727204152.02628: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 19665 1727204152.02683: no more pending results, returning what we have 19665 1727204152.02686: results queue empty 19665 1727204152.02687: checking for any_errors_fatal 19665 1727204152.02693: done checking for any_errors_fatal 19665 1727204152.02693: checking for max_fail_percentage 19665 1727204152.02695: done checking for max_fail_percentage 19665 1727204152.02696: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.02697: done checking to see if all hosts have failed 19665 1727204152.02697: getting the remaining hosts for this loop 19665 1727204152.02699: done getting the remaining hosts for this loop 19665 1727204152.02703: getting the next task for host managed-node3 19665 1727204152.02711: done getting next task for host managed-node3 19665 1727204152.02714: ^ task is: TASK: Fix CentOS6 Base repo 19665 1727204152.02717: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204152.02720: getting variables 19665 1727204152.02722: in VariableManager get_vars() 19665 1727204152.02749: Calling all_inventory to load vars for managed-node3 19665 1727204152.02751: Calling groups_inventory to load vars for managed-node3 19665 1727204152.02754: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.02766: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.02768: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.02777: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.02947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.03159: done with get_vars() 19665 1727204152.03173: done getting variables 19665 1727204152.03348: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.082) 0:00:02.900 ***** 19665 1727204152.03378: entering _queue_task() for managed-node3/copy 19665 1727204152.04184: worker is 1 (out of 1 available) 19665 1727204152.04200: exiting _queue_task() for managed-node3/copy 19665 1727204152.04211: done queuing things up, now waiting for results queue to drain 19665 1727204152.04213: waiting for pending results... 
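Note: judging from the copy action loaded above and the two conditionals evaluated in the trace that follows (ansible_distribution == 'CentOS' is True, ansible_distribution_major_version == '6' is False), the skipped "Fix CentOS6 Base repo" task is probably shaped roughly like the sketch below. The destination and content are assumptions; neither is visible in this log.

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo        # assumed destination
    content: "{{ __fixed_centos6_repo_content }}"  # hypothetical variable; real content not shown
  when:
    - ansible_distribution == 'CentOS'
    - ansible_distribution_major_version == '6'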
19665 1727204152.05126: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 19665 1727204152.05219: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000094 19665 1727204152.05229: variable 'ansible_search_path' from source: unknown 19665 1727204152.05233: variable 'ansible_search_path' from source: unknown 19665 1727204152.05275: calling self._execute() 19665 1727204152.05346: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.05351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.05361: variable 'omit' from source: magic vars 19665 1727204152.07905: variable 'ansible_distribution' from source: facts 19665 1727204152.07928: Evaluated conditional (ansible_distribution == 'CentOS'): True 19665 1727204152.08063: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.08070: Evaluated conditional (ansible_distribution_major_version == '6'): False 19665 1727204152.08073: when evaluation is False, skipping this task 19665 1727204152.08076: _execute() done 19665 1727204152.08080: dumping result to json 19665 1727204152.08082: done dumping result, returning 19665 1727204152.08090: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [0affcd87-79f5-0dcc-3ea6-000000000094] 19665 1727204152.08095: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000094 19665 1727204152.08197: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000094 19665 1727204152.08199: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 19665 1727204152.08259: no more pending results, returning what we have 19665 1727204152.08263: results queue empty 19665 1727204152.08265: checking for any_errors_fatal 19665 1727204152.08271: done checking for any_errors_fatal 19665 1727204152.08272: checking for max_fail_percentage 19665 1727204152.08274: done checking for max_fail_percentage 19665 1727204152.08274: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.08275: done checking to see if all hosts have failed 19665 1727204152.08276: getting the remaining hosts for this loop 19665 1727204152.08278: done getting the remaining hosts for this loop 19665 1727204152.08283: getting the next task for host managed-node3 19665 1727204152.08290: done getting next task for host managed-node3 19665 1727204152.08293: ^ task is: TASK: Include the task 'enable_epel.yml' 19665 1727204152.08296: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.08299: getting variables 19665 1727204152.08301: in VariableManager get_vars() 19665 1727204152.08384: Calling all_inventory to load vars for managed-node3 19665 1727204152.08387: Calling groups_inventory to load vars for managed-node3 19665 1727204152.08391: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.08405: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.08409: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.08412: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.08598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.08806: done with get_vars() 19665 1727204152.08817: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.055) 0:00:02.956 ***** 19665 1727204152.09025: entering _queue_task() for managed-node3/include_tasks 19665 1727204152.09607: worker is 1 (out of 1 available) 19665 1727204152.09620: exiting _queue_task() for managed-node3/include_tasks 19665 1727204152.09632: done queuing things up, now waiting for results queue to drain 19665 1727204152.09633: waiting for pending results... 19665 1727204152.10502: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 19665 1727204152.10779: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000095 19665 1727204152.10815: variable 'ansible_search_path' from source: unknown 19665 1727204152.10884: variable 'ansible_search_path' from source: unknown 19665 1727204152.10924: calling self._execute() 19665 1727204152.11061: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.11155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.11171: variable 'omit' from source: magic vars 19665 1727204152.13360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204152.19382: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204152.19638: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204152.19690: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204152.19828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204152.19869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204152.20024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204152.20137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204152.20181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 19665 1727204152.20229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204152.20255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204152.21477: variable '__network_is_ostree' from source: set_fact 19665 1727204152.21536: Evaluated conditional (not __network_is_ostree | d(false)): True 19665 1727204152.21548: _execute() done 19665 1727204152.21556: dumping result to json 19665 1727204152.21563: done dumping result, returning 19665 1727204152.21575: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [0affcd87-79f5-0dcc-3ea6-000000000095] 19665 1727204152.21586: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000095 19665 1727204152.21794: no more pending results, returning what we have 19665 1727204152.21800: in VariableManager get_vars() 19665 1727204152.21863: Calling all_inventory to load vars for managed-node3 19665 1727204152.21869: Calling groups_inventory to load vars for managed-node3 19665 1727204152.21873: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.21885: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.21888: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.21892: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.22161: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.22372: done with get_vars() 19665 1727204152.22382: variable 'ansible_search_path' from source: unknown 19665 1727204152.22383: variable 'ansible_search_path' from source: unknown 19665 1727204152.22435: we have included files to process 19665 1727204152.22439: generating all_blocks data 19665 1727204152.22441: done generating all_blocks data 19665 1727204152.22448: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19665 1727204152.22450: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19665 1727204152.22453: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 19665 1727204152.23123: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000095 19665 1727204152.23127: WORKER PROCESS EXITING 19665 1727204152.23843: done processing included file 19665 1727204152.23845: iterating over new_blocks loaded from include file 19665 1727204152.23847: in VariableManager get_vars() 19665 1727204152.23860: done with get_vars() 19665 1727204152.23862: filtering new block on tags 19665 1727204152.23887: done filtering new block on tags 19665 1727204152.23891: in VariableManager get_vars() 19665 1727204152.23902: done with get_vars() 19665 1727204152.23904: filtering new block on tags 19665 1727204152.23916: done filtering new block on tags 19665 1727204152.23922: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 19665 1727204152.23932: extending task lists for all hosts with included blocks 
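Note: the include step traced above reduces to something like the sketch below. The file name comes from the "processing included file" lines and the condition from the "Evaluated conditional" line; everything else (the relative path form, for instance) is assumed.

- name: Include the task 'enable_epel.yml'
  ansible.builtin.include_tasks: enable_epel.yml
  when: not __network_is_ostree | d(false)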
19665 1727204152.24050: done extending task lists 19665 1727204152.24051: done processing included files 19665 1727204152.24052: results queue empty 19665 1727204152.24053: checking for any_errors_fatal 19665 1727204152.24057: done checking for any_errors_fatal 19665 1727204152.24058: checking for max_fail_percentage 19665 1727204152.24059: done checking for max_fail_percentage 19665 1727204152.24059: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.24060: done checking to see if all hosts have failed 19665 1727204152.24061: getting the remaining hosts for this loop 19665 1727204152.24062: done getting the remaining hosts for this loop 19665 1727204152.24066: getting the next task for host managed-node3 19665 1727204152.24071: done getting next task for host managed-node3 19665 1727204152.24073: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 19665 1727204152.24075: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.24077: getting variables 19665 1727204152.24078: in VariableManager get_vars() 19665 1727204152.24086: Calling all_inventory to load vars for managed-node3 19665 1727204152.24088: Calling groups_inventory to load vars for managed-node3 19665 1727204152.24090: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.24095: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.24103: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.24106: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.24288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.24611: done with get_vars() 19665 1727204152.24620: done getting variables 19665 1727204152.24785: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204152.25364: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 9] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.164) 0:00:03.120 ***** 19665 1727204152.25414: entering _queue_task() for managed-node3/command 19665 1727204152.25416: Creating lock for command 19665 1727204152.26079: worker is 1 (out of 1 available) 19665 1727204152.26093: exiting _queue_task() for managed-node3/command 19665 1727204152.26185: done queuing things up, now waiting for results queue to drain 19665 1727204152.26192: waiting for pending results... 
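Note: the templated task name ("Create EPEL {{ ansible_distribution_major_version }}" rendered as "Create EPEL 9" just above) and the conditionals evaluated in the run below suggest a command task roughly like this sketch. The actual command never appears in this log, so the cmd value is a hypothetical placeholder.

- name: Create EPEL {{ ansible_distribution_major_version }}
  ansible.builtin.command:
    cmd: "{{ __create_epel_cmd }}"   # hypothetical; the real command is not visible in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']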
19665 1727204152.26514: running TaskExecutor() for managed-node3/TASK: Create EPEL 9 19665 1727204152.26649: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000af 19665 1727204152.26673: variable 'ansible_search_path' from source: unknown 19665 1727204152.26679: variable 'ansible_search_path' from source: unknown 19665 1727204152.26715: calling self._execute() 19665 1727204152.26802: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.26812: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.26868: variable 'omit' from source: magic vars 19665 1727204152.27271: variable 'ansible_distribution' from source: facts 19665 1727204152.27290: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19665 1727204152.27442: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.27452: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 19665 1727204152.27458: when evaluation is False, skipping this task 19665 1727204152.27463: _execute() done 19665 1727204152.27472: dumping result to json 19665 1727204152.27478: done dumping result, returning 19665 1727204152.27487: done running TaskExecutor() for managed-node3/TASK: Create EPEL 9 [0affcd87-79f5-0dcc-3ea6-0000000000af] 19665 1727204152.27496: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000af skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 19665 1727204152.27716: no more pending results, returning what we have 19665 1727204152.27720: results queue empty 19665 1727204152.27722: checking for any_errors_fatal 19665 1727204152.27723: done checking for any_errors_fatal 19665 1727204152.27724: checking for max_fail_percentage 19665 1727204152.27726: done checking for max_fail_percentage 19665 1727204152.27726: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.27727: done checking to see if all hosts have failed 19665 1727204152.27728: getting the remaining hosts for this loop 19665 1727204152.27730: done getting the remaining hosts for this loop 19665 1727204152.27735: getting the next task for host managed-node3 19665 1727204152.27746: done getting next task for host managed-node3 19665 1727204152.27749: ^ task is: TASK: Install yum-utils package 19665 1727204152.27753: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.27757: getting variables 19665 1727204152.27759: in VariableManager get_vars() 19665 1727204152.27793: Calling all_inventory to load vars for managed-node3 19665 1727204152.27796: Calling groups_inventory to load vars for managed-node3 19665 1727204152.27800: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.27813: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.27817: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.27820: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.28162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.28380: done with get_vars() 19665 1727204152.28399: done getting variables 19665 1727204152.28654: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204152.28729: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000af 19665 1727204152.28733: WORKER PROCESS EXITING TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.033) 0:00:03.154 ***** 19665 1727204152.28751: entering _queue_task() for managed-node3/package 19665 1727204152.28753: Creating lock for package 19665 1727204152.29613: worker is 1 (out of 1 available) 19665 1727204152.29693: exiting _queue_task() for managed-node3/package 19665 1727204152.29707: done queuing things up, now waiting for results queue to drain 19665 1727204152.29709: waiting for pending results... 
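Note: a minimal sketch of the "Install yum-utils package" task queued here, assuming the package action seen loading above installs the package named in the task title; the conditionals are taken from the evaluation that follows, and the state is an assumption.

- name: Install yum-utils package
  ansible.builtin.package:
    name: yum-utils
    state: present   # assumed; only the module and the conditionals appear in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']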
19665 1727204152.30498: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 19665 1727204152.30626: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000b0 19665 1727204152.30646: variable 'ansible_search_path' from source: unknown 19665 1727204152.30654: variable 'ansible_search_path' from source: unknown 19665 1727204152.30696: calling self._execute() 19665 1727204152.30883: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.30895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.30907: variable 'omit' from source: magic vars 19665 1727204152.31276: variable 'ansible_distribution' from source: facts 19665 1727204152.31295: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19665 1727204152.31431: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.31443: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 19665 1727204152.31451: when evaluation is False, skipping this task 19665 1727204152.31458: _execute() done 19665 1727204152.31469: dumping result to json 19665 1727204152.31477: done dumping result, returning 19665 1727204152.31494: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [0affcd87-79f5-0dcc-3ea6-0000000000b0] 19665 1727204152.31505: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b0 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 19665 1727204152.31735: no more pending results, returning what we have 19665 1727204152.31739: results queue empty 19665 1727204152.31740: checking for any_errors_fatal 19665 1727204152.31748: done checking for any_errors_fatal 19665 1727204152.31749: checking for max_fail_percentage 19665 1727204152.31750: done checking for max_fail_percentage 19665 1727204152.31751: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.31752: done checking to see if all hosts have failed 19665 1727204152.31752: getting the remaining hosts for this loop 19665 1727204152.31754: done getting the remaining hosts for this loop 19665 1727204152.31759: getting the next task for host managed-node3 19665 1727204152.31767: done getting next task for host managed-node3 19665 1727204152.31770: ^ task is: TASK: Enable EPEL 7 19665 1727204152.31774: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.31777: getting variables 19665 1727204152.31779: in VariableManager get_vars() 19665 1727204152.31804: Calling all_inventory to load vars for managed-node3 19665 1727204152.31806: Calling groups_inventory to load vars for managed-node3 19665 1727204152.31811: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.31825: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.31829: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.31832: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.32005: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.32207: done with get_vars() 19665 1727204152.32217: done getting variables 19665 1727204152.32285: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b0 19665 1727204152.32297: WORKER PROCESS EXITING 19665 1727204152.32429: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.037) 0:00:03.191 ***** 19665 1727204152.32515: entering _queue_task() for managed-node3/command 19665 1727204152.32990: worker is 1 (out of 1 available) 19665 1727204152.33003: exiting _queue_task() for managed-node3/command 19665 1727204152.33015: done queuing things up, now waiting for results queue to drain 19665 1727204152.33017: waiting for pending results... 
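Note: "Enable EPEL 7" (and the "Enable EPEL 8" task that follows it) is a command task gated on the same distribution checks, and both are skipped on this CentOS 9 host. A hedged sketch, with the command itself left as a hypothetical placeholder since it never appears in this log:

- name: Enable EPEL 7
  ansible.builtin.command:
    cmd: "{{ __enable_epel_cmd }}"   # hypothetical placeholder; not shown in this log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']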
19665 1727204152.33974: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 19665 1727204152.34347: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000b1 19665 1727204152.34369: variable 'ansible_search_path' from source: unknown 19665 1727204152.34391: variable 'ansible_search_path' from source: unknown 19665 1727204152.34413: calling self._execute() 19665 1727204152.34517: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.34529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.34554: variable 'omit' from source: magic vars 19665 1727204152.34968: variable 'ansible_distribution' from source: facts 19665 1727204152.34992: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19665 1727204152.35137: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.35149: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 19665 1727204152.35158: when evaluation is False, skipping this task 19665 1727204152.35168: _execute() done 19665 1727204152.35176: dumping result to json 19665 1727204152.35184: done dumping result, returning 19665 1727204152.35200: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [0affcd87-79f5-0dcc-3ea6-0000000000b1] 19665 1727204152.35210: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b1 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 19665 1727204152.35371: no more pending results, returning what we have 19665 1727204152.35375: results queue empty 19665 1727204152.35377: checking for any_errors_fatal 19665 1727204152.35386: done checking for any_errors_fatal 19665 1727204152.35387: checking for max_fail_percentage 19665 1727204152.35389: done checking for max_fail_percentage 19665 1727204152.35390: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.35391: done checking to see if all hosts have failed 19665 1727204152.35392: getting the remaining hosts for this loop 19665 1727204152.35394: done getting the remaining hosts for this loop 19665 1727204152.35398: getting the next task for host managed-node3 19665 1727204152.35407: done getting next task for host managed-node3 19665 1727204152.35410: ^ task is: TASK: Enable EPEL 8 19665 1727204152.35415: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.35418: getting variables 19665 1727204152.35420: in VariableManager get_vars() 19665 1727204152.35452: Calling all_inventory to load vars for managed-node3 19665 1727204152.35455: Calling groups_inventory to load vars for managed-node3 19665 1727204152.35459: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.35475: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.35480: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.35484: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.35680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.35923: done with get_vars() 19665 1727204152.35935: done getting variables 19665 1727204152.36116: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b1 19665 1727204152.36119: WORKER PROCESS EXITING 19665 1727204152.36152: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.037) 0:00:03.229 ***** 19665 1727204152.36294: entering _queue_task() for managed-node3/command 19665 1727204152.36740: worker is 1 (out of 1 available) 19665 1727204152.36752: exiting _queue_task() for managed-node3/command 19665 1727204152.36766: done queuing things up, now waiting for results queue to drain 19665 1727204152.36768: waiting for pending results... 
19665 1727204152.37031: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 19665 1727204152.37147: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000b2 19665 1727204152.37166: variable 'ansible_search_path' from source: unknown 19665 1727204152.37175: variable 'ansible_search_path' from source: unknown 19665 1727204152.37222: calling self._execute() 19665 1727204152.37306: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.37325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.37339: variable 'omit' from source: magic vars 19665 1727204152.37976: variable 'ansible_distribution' from source: facts 19665 1727204152.37996: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19665 1727204152.38145: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.38233: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 19665 1727204152.38242: when evaluation is False, skipping this task 19665 1727204152.38249: _execute() done 19665 1727204152.38256: dumping result to json 19665 1727204152.38263: done dumping result, returning 19665 1727204152.38278: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [0affcd87-79f5-0dcc-3ea6-0000000000b2] 19665 1727204152.38289: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b2 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 19665 1727204152.38448: no more pending results, returning what we have 19665 1727204152.38452: results queue empty 19665 1727204152.38453: checking for any_errors_fatal 19665 1727204152.38459: done checking for any_errors_fatal 19665 1727204152.38460: checking for max_fail_percentage 19665 1727204152.38462: done checking for max_fail_percentage 19665 1727204152.38463: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.38466: done checking to see if all hosts have failed 19665 1727204152.38467: getting the remaining hosts for this loop 19665 1727204152.38469: done getting the remaining hosts for this loop 19665 1727204152.38473: getting the next task for host managed-node3 19665 1727204152.38485: done getting next task for host managed-node3 19665 1727204152.38488: ^ task is: TASK: Enable EPEL 6 19665 1727204152.38492: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.38496: getting variables 19665 1727204152.38498: in VariableManager get_vars() 19665 1727204152.38530: Calling all_inventory to load vars for managed-node3 19665 1727204152.38533: Calling groups_inventory to load vars for managed-node3 19665 1727204152.38537: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.38551: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.38555: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.38559: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.38751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.38959: done with get_vars() 19665 1727204152.38973: done getting variables 19665 1727204152.39237: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204152.39343: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b2 19665 1727204152.39347: WORKER PROCESS EXITING TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.030) 0:00:03.260 ***** 19665 1727204152.39363: entering _queue_task() for managed-node3/copy 19665 1727204152.40152: worker is 1 (out of 1 available) 19665 1727204152.40169: exiting _queue_task() for managed-node3/copy 19665 1727204152.40182: done queuing things up, now waiting for results queue to drain 19665 1727204152.40183: waiting for pending results... 
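Note: unlike the EPEL 7/8 steps, "Enable EPEL 6" loads the copy action and is gated on ansible_distribution_major_version == '6' (False on this host). A rough sketch under those observations; the destination and content are assumptions:

- name: Enable EPEL 6
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/epel.repo        # assumed destination
    content: "{{ __epel6_repo_content }}"   # hypothetical variable; not shown in the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'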
19665 1727204152.40935: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 19665 1727204152.41050: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000b4 19665 1727204152.41070: variable 'ansible_search_path' from source: unknown 19665 1727204152.41077: variable 'ansible_search_path' from source: unknown 19665 1727204152.41125: calling self._execute() 19665 1727204152.41212: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.41226: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.41240: variable 'omit' from source: magic vars 19665 1727204152.41638: variable 'ansible_distribution' from source: facts 19665 1727204152.41659: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 19665 1727204152.41788: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.41801: Evaluated conditional (ansible_distribution_major_version == '6'): False 19665 1727204152.41809: when evaluation is False, skipping this task 19665 1727204152.41816: _execute() done 19665 1727204152.41823: dumping result to json 19665 1727204152.41830: done dumping result, returning 19665 1727204152.41843: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [0affcd87-79f5-0dcc-3ea6-0000000000b4] 19665 1727204152.41858: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b4 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 19665 1727204152.42018: no more pending results, returning what we have 19665 1727204152.42022: results queue empty 19665 1727204152.42024: checking for any_errors_fatal 19665 1727204152.42031: done checking for any_errors_fatal 19665 1727204152.42032: checking for max_fail_percentage 19665 1727204152.42034: done checking for max_fail_percentage 19665 1727204152.42035: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.42036: done checking to see if all hosts have failed 19665 1727204152.42037: getting the remaining hosts for this loop 19665 1727204152.42039: done getting the remaining hosts for this loop 19665 1727204152.42044: getting the next task for host managed-node3 19665 1727204152.42054: done getting next task for host managed-node3 19665 1727204152.42058: ^ task is: TASK: Set network provider to 'nm' 19665 1727204152.42060: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.42066: getting variables 19665 1727204152.42069: in VariableManager get_vars() 19665 1727204152.42103: Calling all_inventory to load vars for managed-node3 19665 1727204152.42106: Calling groups_inventory to load vars for managed-node3 19665 1727204152.42110: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.42124: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.42128: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.42131: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.42380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.42812: done with get_vars() 19665 1727204152.42826: done getting variables 19665 1727204152.43018: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204152.43100: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000b4 19665 1727204152.43104: WORKER PROCESS EXITING TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:13 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.037) 0:00:03.297 ***** 19665 1727204152.43126: entering _queue_task() for managed-node3/set_fact 19665 1727204152.43651: worker is 1 (out of 1 available) 19665 1727204152.43780: exiting _queue_task() for managed-node3/set_fact 19665 1727204152.43793: done queuing things up, now waiting for results queue to drain 19665 1727204152.43795: waiting for pending results... 
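Note: the "Set network provider to 'nm'" task from tests_bridge_nm.yml:13 reduces to a one-line set_fact; the fact name and value come directly from the result logged below, the rest is a minimal sketch.

- name: Set network provider to 'nm'
  ansible.builtin.set_fact:
    network_provider: nm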
19665 1727204152.44912: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 19665 1727204152.45034: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000007 19665 1727204152.45055: variable 'ansible_search_path' from source: unknown 19665 1727204152.45099: calling self._execute() 19665 1727204152.45188: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.45255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.45368: variable 'omit' from source: magic vars 19665 1727204152.45591: variable 'omit' from source: magic vars 19665 1727204152.45627: variable 'omit' from source: magic vars 19665 1727204152.45672: variable 'omit' from source: magic vars 19665 1727204152.45820: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204152.45867: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204152.46008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204152.46033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.46052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.46091: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204152.46094: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.46097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.46310: Set connection var ansible_connection to ssh 19665 1727204152.46325: Set connection var ansible_shell_type to sh 19665 1727204152.46343: Set connection var ansible_timeout to 10 19665 1727204152.46453: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204152.46472: Set connection var ansible_shell_executable to /bin/sh 19665 1727204152.46485: Set connection var ansible_pipelining to False 19665 1727204152.46514: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.46523: variable 'ansible_connection' from source: unknown 19665 1727204152.46526: variable 'ansible_module_compression' from source: unknown 19665 1727204152.46529: variable 'ansible_shell_type' from source: unknown 19665 1727204152.46533: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.46536: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.46542: variable 'ansible_pipelining' from source: unknown 19665 1727204152.46553: variable 'ansible_timeout' from source: unknown 19665 1727204152.46611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.46746: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204152.46764: variable 'omit' from source: magic vars 19665 1727204152.46777: starting attempt loop 19665 1727204152.46783: running the handler 19665 1727204152.46797: handler run complete 19665 1727204152.46959: attempt loop complete, returning result 19665 1727204152.46969: _execute() done 19665 1727204152.46976: 
dumping result to json 19665 1727204152.46985: done dumping result, returning 19665 1727204152.47000: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [0affcd87-79f5-0dcc-3ea6-000000000007] 19665 1727204152.47009: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000007 ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 19665 1727204152.47178: no more pending results, returning what we have 19665 1727204152.47181: results queue empty 19665 1727204152.47183: checking for any_errors_fatal 19665 1727204152.47189: done checking for any_errors_fatal 19665 1727204152.47190: checking for max_fail_percentage 19665 1727204152.47192: done checking for max_fail_percentage 19665 1727204152.47193: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.47194: done checking to see if all hosts have failed 19665 1727204152.47195: getting the remaining hosts for this loop 19665 1727204152.47198: done getting the remaining hosts for this loop 19665 1727204152.47202: getting the next task for host managed-node3 19665 1727204152.47211: done getting next task for host managed-node3 19665 1727204152.47214: ^ task is: TASK: meta (flush_handlers) 19665 1727204152.47217: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204152.47223: getting variables 19665 1727204152.47225: in VariableManager get_vars() 19665 1727204152.47261: Calling all_inventory to load vars for managed-node3 19665 1727204152.47269: Calling groups_inventory to load vars for managed-node3 19665 1727204152.47273: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.47286: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.47289: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.47292: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.47522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.47752: done with get_vars() 19665 1727204152.47766: done getting variables 19665 1727204152.47948: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000007 19665 1727204152.47951: WORKER PROCESS EXITING 19665 1727204152.47976: in VariableManager get_vars() 19665 1727204152.47988: Calling all_inventory to load vars for managed-node3 19665 1727204152.47990: Calling groups_inventory to load vars for managed-node3 19665 1727204152.47993: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.47998: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.48000: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.48003: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.48469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.48705: done with get_vars() 19665 1727204152.48720: done queuing things up, now waiting for results queue to drain 19665 1727204152.48722: results queue empty 19665 1727204152.48723: checking for any_errors_fatal 19665 1727204152.48726: done checking for any_errors_fatal 19665 1727204152.48726: checking for 
max_fail_percentage 19665 1727204152.48728: done checking for max_fail_percentage 19665 1727204152.48728: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.48729: done checking to see if all hosts have failed 19665 1727204152.48730: getting the remaining hosts for this loop 19665 1727204152.48731: done getting the remaining hosts for this loop 19665 1727204152.48733: getting the next task for host managed-node3 19665 1727204152.48740: done getting next task for host managed-node3 19665 1727204152.48742: ^ task is: TASK: meta (flush_handlers) 19665 1727204152.48744: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204152.48752: getting variables 19665 1727204152.48753: in VariableManager get_vars() 19665 1727204152.48762: Calling all_inventory to load vars for managed-node3 19665 1727204152.48766: Calling groups_inventory to load vars for managed-node3 19665 1727204152.48768: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.48773: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.48775: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.48778: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.48926: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.49114: done with get_vars() 19665 1727204152.49126: done getting variables 19665 1727204152.49174: in VariableManager get_vars() 19665 1727204152.49183: Calling all_inventory to load vars for managed-node3 19665 1727204152.49186: Calling groups_inventory to load vars for managed-node3 19665 1727204152.49188: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.49192: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.49194: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.49197: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.49327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.49522: done with get_vars() 19665 1727204152.49538: done queuing things up, now waiting for results queue to drain 19665 1727204152.49540: results queue empty 19665 1727204152.49540: checking for any_errors_fatal 19665 1727204152.49542: done checking for any_errors_fatal 19665 1727204152.49542: checking for max_fail_percentage 19665 1727204152.49543: done checking for max_fail_percentage 19665 1727204152.49544: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.49545: done checking to see if all hosts have failed 19665 1727204152.49546: getting the remaining hosts for this loop 19665 1727204152.49547: done getting the remaining hosts for this loop 19665 1727204152.49549: getting the next task for host managed-node3 19665 1727204152.49552: done getting next task for host managed-node3 19665 1727204152.49553: ^ task is: None 19665 1727204152.49559: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 19665 1727204152.49560: done queuing things up, now waiting for results queue to drain 19665 1727204152.49561: results queue empty 19665 1727204152.49561: checking for any_errors_fatal 19665 1727204152.49562: done checking for any_errors_fatal 19665 1727204152.49563: checking for max_fail_percentage 19665 1727204152.49564: done checking for max_fail_percentage 19665 1727204152.49564: checking to see if all hosts have failed and the running result is not ok 19665 1727204152.49566: done checking to see if all hosts have failed 19665 1727204152.49569: getting the next task for host managed-node3 19665 1727204152.49571: done getting next task for host managed-node3 19665 1727204152.49572: ^ task is: None 19665 1727204152.49573: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204152.49626: in VariableManager get_vars() 19665 1727204152.49643: done with get_vars() 19665 1727204152.49649: in VariableManager get_vars() 19665 1727204152.49658: done with get_vars() 19665 1727204152.49666: variable 'omit' from source: magic vars 19665 1727204152.49699: in VariableManager get_vars() 19665 1727204152.49709: done with get_vars() 19665 1727204152.49731: variable 'omit' from source: magic vars PLAY [Test configuring bridges] ************************************************ 19665 1727204152.49984: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204152.50017: getting the remaining hosts for this loop 19665 1727204152.50018: done getting the remaining hosts for this loop 19665 1727204152.50021: getting the next task for host managed-node3 19665 1727204152.50024: done getting next task for host managed-node3 19665 1727204152.50026: ^ task is: TASK: Gathering Facts 19665 1727204152.50027: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204152.50029: getting variables 19665 1727204152.50030: in VariableManager get_vars() 19665 1727204152.50041: Calling all_inventory to load vars for managed-node3 19665 1727204152.50043: Calling groups_inventory to load vars for managed-node3 19665 1727204152.50045: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204152.50050: Calling all_plugins_play to load vars for managed-node3 19665 1727204152.50066: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204152.50069: Calling groups_plugins_play to load vars for managed-node3 19665 1727204152.50213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204152.50402: done with get_vars() 19665 1727204152.50410: done getting variables 19665 1727204152.50456: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 Tuesday 24 September 2024 14:55:52 -0400 (0:00:00.073) 0:00:03.371 ***** 19665 1727204152.50486: entering _queue_task() for managed-node3/gather_facts 19665 1727204152.50782: worker is 1 (out of 1 available) 19665 1727204152.50800: exiting _queue_task() for managed-node3/gather_facts 19665 1727204152.50813: done queuing things up, now waiting for results queue to drain 19665 1727204152.50816: waiting for pending results... 
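Aside: the records above repeat the strategy loop that the linear strategy walks for each host — fetch the next task, queue it to a worker ("worker is 1 (out of 1 available)"), then wait for the results queue to drain and check any_errors_fatal/max_fail_percentage before moving on. As a rough illustration only (not Ansible's actual classes or APIs), a minimal sketch of that queue-and-drain pattern could look like the following; the task list and the run_task helper are hypothetical.

```python
# Minimal sketch of "queue task to a worker, then drain the results queue",
# loosely mirroring the strategy records in the log above. All names here
# (run_task, TASKS) are hypothetical illustrations, not Ansible internals.
from multiprocessing import Process, Queue

TASKS = ["meta (flush_handlers)", "Gathering Facts"]

def run_task(task, results):
    # A real worker would execute the task; here we only report success.
    results.put({"task": task, "changed": False, "failed": False})

def strategy_loop(tasks):
    results = Queue()
    for task in tasks:
        worker = Process(target=run_task, args=(task, results))  # worker is 1 (out of 1 available)
        worker.start()
        worker.join()                      # waiting for pending results...
        result = results.get()             # results queue empty afterwards
        if result["failed"]:               # checking for any_errors_fatal
            break
        print(f"ok: {result['task']}")

if __name__ == "__main__":
    strategy_loop(TASKS)
```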
19665 1727204152.51081: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204152.51191: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000da 19665 1727204152.51209: variable 'ansible_search_path' from source: unknown 19665 1727204152.51260: calling self._execute() 19665 1727204152.51345: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.51356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.51377: variable 'omit' from source: magic vars 19665 1727204152.51777: variable 'ansible_distribution_major_version' from source: facts 19665 1727204152.51799: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204152.51809: variable 'omit' from source: magic vars 19665 1727204152.51840: variable 'omit' from source: magic vars 19665 1727204152.51886: variable 'omit' from source: magic vars 19665 1727204152.51934: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204152.51979: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204152.52015: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204152.52041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.52058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204152.52100: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204152.52108: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.52117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.52224: Set connection var ansible_connection to ssh 19665 1727204152.52245: Set connection var ansible_shell_type to sh 19665 1727204152.52257: Set connection var ansible_timeout to 10 19665 1727204152.52270: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204152.52282: Set connection var ansible_shell_executable to /bin/sh 19665 1727204152.52291: Set connection var ansible_pipelining to False 19665 1727204152.52319: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.52326: variable 'ansible_connection' from source: unknown 19665 1727204152.52331: variable 'ansible_module_compression' from source: unknown 19665 1727204152.52339: variable 'ansible_shell_type' from source: unknown 19665 1727204152.52346: variable 'ansible_shell_executable' from source: unknown 19665 1727204152.52351: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204152.52357: variable 'ansible_pipelining' from source: unknown 19665 1727204152.52361: variable 'ansible_timeout' from source: unknown 19665 1727204152.52370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204152.52567: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204152.52586: variable 'omit' from source: magic vars 19665 1727204152.52596: starting attempt loop 19665 1727204152.52603: running the 
handler 19665 1727204152.52622: variable 'ansible_facts' from source: unknown 19665 1727204152.52655: _low_level_execute_command(): starting 19665 1727204152.52674: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204152.53647: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204152.53671: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.53689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.53711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.53802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.53900: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204152.53916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.53940: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204152.53953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204152.53967: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204152.53982: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.54003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.54022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.54035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.54051: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204152.54068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.54156: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204152.54184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204152.54200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204152.54300: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204152.56441: stdout chunk (state=3): >>>/root <<< 19665 1727204152.56698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204152.56703: stdout chunk (state=3): >>><<< 19665 1727204152.56711: stderr chunk (state=3): >>><<< 19665 1727204152.56844: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204152.56848: _low_level_execute_command(): starting 19665 1727204152.56851: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202 `" && echo ansible-tmp-1727204152.567351-19915-181142935375202="` echo /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202 `" ) && sleep 0' 19665 1727204152.57513: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204152.57534: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.57555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.57577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.57625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.57644: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204152.57660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.57682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204152.57695: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204152.57707: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204152.57720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.57734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.57758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.57774: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.57786: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204152.57800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.57886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204152.57910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204152.57928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204152.58020: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204152.60607: stdout chunk (state=3): >>>ansible-tmp-1727204152.567351-19915-181142935375202=/root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202 <<< 19665 1727204152.60794: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204152.60910: stderr chunk (state=3): >>><<< 19665 1727204152.60922: stdout chunk (state=3): >>><<< 19665 1727204152.61078: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204152.567351-19915-181142935375202=/root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204152.61083: variable 'ansible_module_compression' from source: unknown 19665 1727204152.61085: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204152.61187: variable 'ansible_facts' from source: unknown 19665 1727204152.61641: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/AnsiballZ_setup.py 19665 1727204152.61809: Sending initial data 19665 1727204152.61812: Sent initial data (153 bytes) 19665 1727204152.62812: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204152.62829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.62846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.62865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.62911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.62922: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204152.62935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.62954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204152.62968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204152.62979: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204152.62991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.63004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.63019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.63030: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.63044: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204152.63058: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.63140: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204152.63163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204152.63187: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204152.63272: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204152.65838: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204152.65881: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204152.65925: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmprl_erd__ /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/AnsiballZ_setup.py <<< 19665 1727204152.65974: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204152.68663: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204152.68888: stderr chunk (state=3): >>><<< 19665 1727204152.68892: stdout chunk (state=3): >>><<< 19665 1727204152.68894: done transferring module to remote 19665 1727204152.68897: _low_level_execute_command(): starting 19665 1727204152.68903: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/ /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/AnsiballZ_setup.py && sleep 0' 19665 1727204152.69738: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204152.69777: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.69807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.69832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.69908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.69922: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204152.69940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.69960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204152.69977: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204152.69995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204152.70026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.70045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.70062: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.70089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.70108: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204152.70130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.70223: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204152.70262: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204152.70282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204152.70369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204152.72893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204152.73028: stderr chunk (state=3): >>><<< 19665 1727204152.73067: stdout chunk (state=3): >>><<< 19665 1727204152.73230: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204152.73234: _low_level_execute_command(): starting 19665 1727204152.73240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/AnsiballZ_setup.py && sleep 0' 19665 1727204152.73933: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204152.73951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.73967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.73991: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.74045: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.74058: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204152.74093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.74122: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204152.74134: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204152.74150: stderr chunk 
(state=3): >>>debug1: re-parsing configuration <<< 19665 1727204152.74162: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204152.74180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204152.74196: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204152.74222: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204152.74235: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204152.74254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204152.74355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204152.74382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204152.74403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204152.74502: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.42609: stdout chunk (state=3): >>> <<< 19665 1727204153.42657: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "53", "epoch": "1727204153", "epoch_int": "1727204153", "date": "2024-09-24", "time": "14:55:53", "iso8601_micro": "2024-09-24T18:55:53.121086Z", "iso8601": "2024-09-24T18:55:53Z", "iso8601_basic": "20240924T145553121086", "iso8601_basic_short": "20240924T145553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8<<< 19665 1727204153.42714: stdout chunk (state=3): 
>>>eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2814, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3273, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": 
null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_de<<< 19665 1727204153.42718: stdout chunk (state=3): >>>vice_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 499, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264281894912, "block_size": 4096, "block_total": 65519355, "block_available": 64521947, "block_used": 997408, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": 
"off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic"<<< 19665 1727204153.42783: stdout chunk (state=3): >>>: "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "h<<< 19665 1727204153.42799: stdout chunk (state=3): >>>sr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": 
"255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.4, "5m": 0.35, "15m": 0.17}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204153.45191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204153.45266: stderr chunk (state=3): >>><<< 19665 1727204153.45269: stdout chunk (state=3): >>><<< 19665 1727204153.45480: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "53", "epoch": "1727204153", "epoch_int": "1727204153", "date": "2024-09-24", "time": "14:55:53", "iso8601_micro": "2024-09-24T18:55:53.121086Z", "iso8601": "2024-09-24T18:55:53Z", "iso8601_basic": "20240924T145553121086", "iso8601_basic_short": "20240924T145553", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2814, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 718, "free": 2814}, "nocache": {"free": 3273, "used": 259}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 499, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264281894912, "block_size": 4096, "block_total": 65519355, "block_available": 64521947, "block_used": 997408, "inode_total": 131071472, "inode_available": 130998308, "inode_used": 73164, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], 
"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on 
[fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.4, "5m": 0.35, "15m": 0.17}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, 
"ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204153.45715: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204153.45742: _low_level_execute_command(): starting 19665 1727204153.45752: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204152.567351-19915-181142935375202/ > /dev/null 2>&1 && sleep 0' 19665 1727204153.46486: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204153.46502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.46516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.46533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.46587: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204153.46603: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204153.46617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.46634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204153.46646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204153.46657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204153.46678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.46693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.46710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.46724: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204153.46734: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204153.46748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.46838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204153.46860: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204153.46879: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.46966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.49447: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204153.49451: stdout chunk (state=3): >>><<< 19665 1727204153.49453: stderr chunk (state=3): >>><<< 19665 1727204153.49678: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204153.49682: handler run complete 19665 1727204153.49684: variable 'ansible_facts' from source: unknown 19665 1727204153.49804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.50140: variable 'ansible_facts' from source: unknown 19665 1727204153.50246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.50395: attempt loop complete, returning result 19665 1727204153.50404: _execute() done 19665 1727204153.50411: dumping result to json 19665 1727204153.50460: done dumping result, returning 19665 1727204153.50476: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-0000000000da] 19665 1727204153.50486: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000da ok: [managed-node3] 19665 1727204153.51105: no more pending results, returning what we have 19665 1727204153.51108: results queue empty 19665 1727204153.51109: checking for any_errors_fatal 19665 1727204153.51111: done checking for any_errors_fatal 19665 1727204153.51111: checking for max_fail_percentage 19665 1727204153.51113: done checking for max_fail_percentage 19665 1727204153.51114: checking to see if all hosts have failed and the running result is not ok 19665 1727204153.51115: done checking to see if all hosts have failed 19665 1727204153.51116: getting the remaining hosts for this loop 19665 1727204153.51117: done getting the remaining hosts for this loop 19665 1727204153.51121: getting the next task for host managed-node3 19665 1727204153.51129: done getting next task for host managed-node3 19665 1727204153.51131: ^ task is: TASK: meta (flush_handlers) 19665 1727204153.51133: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204153.51137: getting variables 19665 1727204153.51139: in VariableManager get_vars() 19665 1727204153.51195: Calling all_inventory to load vars for managed-node3 19665 1727204153.51198: Calling groups_inventory to load vars for managed-node3 19665 1727204153.51201: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.51213: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.51217: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.51220: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.51461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.51780: done with get_vars() 19665 1727204153.51790: done getting variables 19665 1727204153.51952: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000da 19665 1727204153.51955: WORKER PROCESS EXITING 19665 1727204153.52002: in VariableManager get_vars() 19665 1727204153.52012: Calling all_inventory to load vars for managed-node3 19665 1727204153.52015: Calling groups_inventory to load vars for managed-node3 19665 1727204153.52017: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.52022: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.52024: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.52032: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.52334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.52525: done with get_vars() 19665 1727204153.52535: done queuing things up, now waiting for results queue to drain 19665 1727204153.52538: results queue empty 19665 1727204153.52539: checking for any_errors_fatal 19665 1727204153.52541: done checking for any_errors_fatal 19665 1727204153.52542: checking for max_fail_percentage 19665 1727204153.52542: done checking for max_fail_percentage 19665 1727204153.52543: checking to see if all hosts have failed and the running result is not ok 19665 1727204153.52543: done checking to see if all hosts have failed 19665 1727204153.52544: getting the remaining hosts for this loop 19665 1727204153.52544: done getting the remaining hosts for this loop 19665 1727204153.52546: getting the next task for host managed-node3 19665 1727204153.52548: done getting next task for host managed-node3 19665 1727204153.52550: ^ task is: TASK: Set interface={{ interface }} 19665 1727204153.52551: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204153.52552: getting variables 19665 1727204153.52553: in VariableManager get_vars() 19665 1727204153.52558: Calling all_inventory to load vars for managed-node3 19665 1727204153.52559: Calling groups_inventory to load vars for managed-node3 19665 1727204153.52561: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.52566: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.52568: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.52570: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.52656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.52766: done with get_vars() 19665 1727204153.52773: done getting variables 19665 1727204153.52801: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204153.52896: variable 'interface' from source: play vars TASK [Set interface=LSR-TST-br31] ********************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:9 Tuesday 24 September 2024 14:55:53 -0400 (0:00:01.024) 0:00:04.395 ***** 19665 1727204153.52925: entering _queue_task() for managed-node3/set_fact 19665 1727204153.53124: worker is 1 (out of 1 available) 19665 1727204153.53139: exiting _queue_task() for managed-node3/set_fact 19665 1727204153.53151: done queuing things up, now waiting for results queue to drain 19665 1727204153.53153: waiting for pending results... 
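The task header above comes from tests_bridge.yml:9, where the play variable `interface` is pinned as a host fact before any interface work begins. The playbook source is not shown in this log, so the YAML below is only a plausible reconstruction based on the templated task name and the set_fact result reported a few lines further on:

    # Hypothetical reconstruction of the task at tests_bridge.yml:9 (not the
    # verbatim file). The play presumably defines `interface: LSR-TST-br31`
    # in its vars; this task re-exports it as a fact so included task files
    # can reference it later.
    - name: Set interface={{ interface }}
      set_fact:
        interface: "{{ interface }}"
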
19665 1727204153.53303: running TaskExecutor() for managed-node3/TASK: Set interface=LSR-TST-br31 19665 1727204153.53363: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000000b 19665 1727204153.53381: variable 'ansible_search_path' from source: unknown 19665 1727204153.53415: calling self._execute() 19665 1727204153.53542: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.53546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.53553: variable 'omit' from source: magic vars 19665 1727204153.53802: variable 'ansible_distribution_major_version' from source: facts 19665 1727204153.53811: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204153.53822: variable 'omit' from source: magic vars 19665 1727204153.53844: variable 'omit' from source: magic vars 19665 1727204153.53867: variable 'interface' from source: play vars 19665 1727204153.53918: variable 'interface' from source: play vars 19665 1727204153.53939: variable 'omit' from source: magic vars 19665 1727204153.53974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204153.54000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204153.54016: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204153.54036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204153.54048: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204153.54073: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204153.54076: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.54078: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.54146: Set connection var ansible_connection to ssh 19665 1727204153.54151: Set connection var ansible_shell_type to sh 19665 1727204153.54161: Set connection var ansible_timeout to 10 19665 1727204153.54167: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204153.54174: Set connection var ansible_shell_executable to /bin/sh 19665 1727204153.54181: Set connection var ansible_pipelining to False 19665 1727204153.54197: variable 'ansible_shell_executable' from source: unknown 19665 1727204153.54200: variable 'ansible_connection' from source: unknown 19665 1727204153.54203: variable 'ansible_module_compression' from source: unknown 19665 1727204153.54205: variable 'ansible_shell_type' from source: unknown 19665 1727204153.54207: variable 'ansible_shell_executable' from source: unknown 19665 1727204153.54209: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.54212: variable 'ansible_pipelining' from source: unknown 19665 1727204153.54215: variable 'ansible_timeout' from source: unknown 19665 1727204153.54219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.54344: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 
1727204153.54355: variable 'omit' from source: magic vars 19665 1727204153.54361: starting attempt loop 19665 1727204153.54370: running the handler 19665 1727204153.54375: handler run complete 19665 1727204153.54384: attempt loop complete, returning result 19665 1727204153.54387: _execute() done 19665 1727204153.54390: dumping result to json 19665 1727204153.54392: done dumping result, returning 19665 1727204153.54402: done running TaskExecutor() for managed-node3/TASK: Set interface=LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-00000000000b] 19665 1727204153.54449: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000b ok: [managed-node3] => { "ansible_facts": { "interface": "LSR-TST-br31" }, "changed": false } 19665 1727204153.54651: no more pending results, returning what we have 19665 1727204153.54654: results queue empty 19665 1727204153.54656: checking for any_errors_fatal 19665 1727204153.54658: done checking for any_errors_fatal 19665 1727204153.54658: checking for max_fail_percentage 19665 1727204153.54660: done checking for max_fail_percentage 19665 1727204153.54661: checking to see if all hosts have failed and the running result is not ok 19665 1727204153.54663: done checking to see if all hosts have failed 19665 1727204153.54665: getting the remaining hosts for this loop 19665 1727204153.54668: done getting the remaining hosts for this loop 19665 1727204153.54673: getting the next task for host managed-node3 19665 1727204153.54680: done getting next task for host managed-node3 19665 1727204153.54684: ^ task is: TASK: Include the task 'show_interfaces.yml' 19665 1727204153.54685: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204153.54689: getting variables 19665 1727204153.54690: in VariableManager get_vars() 19665 1727204153.54829: Calling all_inventory to load vars for managed-node3 19665 1727204153.54831: Calling groups_inventory to load vars for managed-node3 19665 1727204153.54835: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.54844: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000b 19665 1727204153.54847: WORKER PROCESS EXITING 19665 1727204153.54856: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.54860: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.54863: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.55167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.55370: done with get_vars() 19665 1727204153.55379: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:12 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.025) 0:00:04.421 ***** 19665 1727204153.55474: entering _queue_task() for managed-node3/include_tasks 19665 1727204153.55723: worker is 1 (out of 1 available) 19665 1727204153.55734: exiting _queue_task() for managed-node3/include_tasks 19665 1727204153.55748: done queuing things up, now waiting for results queue to drain 19665 1727204153.55750: waiting for pending results... 
19665 1727204153.55996: running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' 19665 1727204153.56096: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000000c 19665 1727204153.56119: variable 'ansible_search_path' from source: unknown 19665 1727204153.56159: calling self._execute() 19665 1727204153.56251: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.56262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.56277: variable 'omit' from source: magic vars 19665 1727204153.56641: variable 'ansible_distribution_major_version' from source: facts 19665 1727204153.56650: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204153.56655: _execute() done 19665 1727204153.56658: dumping result to json 19665 1727204153.56662: done dumping result, returning 19665 1727204153.56672: done running TaskExecutor() for managed-node3/TASK: Include the task 'show_interfaces.yml' [0affcd87-79f5-0dcc-3ea6-00000000000c] 19665 1727204153.56677: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000c 19665 1727204153.56759: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000c 19665 1727204153.56762: WORKER PROCESS EXITING 19665 1727204153.56788: no more pending results, returning what we have 19665 1727204153.56792: in VariableManager get_vars() 19665 1727204153.56826: Calling all_inventory to load vars for managed-node3 19665 1727204153.56828: Calling groups_inventory to load vars for managed-node3 19665 1727204153.56831: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.56843: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.56846: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.56849: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.56968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.57097: done with get_vars() 19665 1727204153.57102: variable 'ansible_search_path' from source: unknown 19665 1727204153.57113: we have included files to process 19665 1727204153.57114: generating all_blocks data 19665 1727204153.57115: done generating all_blocks data 19665 1727204153.57115: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19665 1727204153.57116: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19665 1727204153.57117: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 19665 1727204153.57222: in VariableManager get_vars() 19665 1727204153.57233: done with get_vars() 19665 1727204153.57309: done processing included file 19665 1727204153.57311: iterating over new_blocks loaded from include file 19665 1727204153.57312: in VariableManager get_vars() 19665 1727204153.57319: done with get_vars() 19665 1727204153.57320: filtering new block on tags 19665 1727204153.57333: done filtering new block on tags 19665 1727204153.57335: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node3 19665 1727204153.57340: extending task lists for all hosts with included blocks 19665 
1727204153.57380: done extending task lists 19665 1727204153.57381: done processing included files 19665 1727204153.57381: results queue empty 19665 1727204153.57382: checking for any_errors_fatal 19665 1727204153.57384: done checking for any_errors_fatal 19665 1727204153.57385: checking for max_fail_percentage 19665 1727204153.57385: done checking for max_fail_percentage 19665 1727204153.57386: checking to see if all hosts have failed and the running result is not ok 19665 1727204153.57386: done checking to see if all hosts have failed 19665 1727204153.57387: getting the remaining hosts for this loop 19665 1727204153.57388: done getting the remaining hosts for this loop 19665 1727204153.57389: getting the next task for host managed-node3 19665 1727204153.57392: done getting next task for host managed-node3 19665 1727204153.57393: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 19665 1727204153.57394: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204153.57396: getting variables 19665 1727204153.57397: in VariableManager get_vars() 19665 1727204153.57402: Calling all_inventory to load vars for managed-node3 19665 1727204153.57403: Calling groups_inventory to load vars for managed-node3 19665 1727204153.57405: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.57408: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.57410: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.57411: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.57497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.57605: done with get_vars() 19665 1727204153.57611: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.021) 0:00:04.443 ***** 19665 1727204153.57661: entering _queue_task() for managed-node3/include_tasks 19665 1727204153.57848: worker is 1 (out of 1 available) 19665 1727204153.57862: exiting _queue_task() for managed-node3/include_tasks 19665 1727204153.57875: done queuing things up, now waiting for results queue to drain 19665 1727204153.57877: waiting for pending results... 
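The include chain visible at this point is: tests_bridge.yml:12 includes tasks/show_interfaces.yml, and line 3 of that file in turn includes tasks/get_current_interfaces.yml. A minimal sketch of show_interfaces.yml consistent with the task paths in the log (only the include is confirmed; the trailing debug task is an assumption about what a "show" helper typically does):

    # Sketch of tasks/show_interfaces.yml; the include near the top matches the
    # task path show_interfaces.yml:3 in the log, the debug task is assumed.
    ---
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"
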
19665 1727204153.58021: running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' 19665 1727204153.58086: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000ee 19665 1727204153.58100: variable 'ansible_search_path' from source: unknown 19665 1727204153.58104: variable 'ansible_search_path' from source: unknown 19665 1727204153.58139: calling self._execute() 19665 1727204153.58226: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.58259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.58263: variable 'omit' from source: magic vars 19665 1727204153.58685: variable 'ansible_distribution_major_version' from source: facts 19665 1727204153.58701: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204153.58710: _execute() done 19665 1727204153.58717: dumping result to json 19665 1727204153.58724: done dumping result, returning 19665 1727204153.58732: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_current_interfaces.yml' [0affcd87-79f5-0dcc-3ea6-0000000000ee] 19665 1727204153.58750: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000ee 19665 1727204153.58853: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000ee 19665 1727204153.58859: WORKER PROCESS EXITING 19665 1727204153.58942: no more pending results, returning what we have 19665 1727204153.58947: in VariableManager get_vars() 19665 1727204153.58980: Calling all_inventory to load vars for managed-node3 19665 1727204153.58982: Calling groups_inventory to load vars for managed-node3 19665 1727204153.58986: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.58997: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.59000: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.59002: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.59262: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.59472: done with get_vars() 19665 1727204153.59479: variable 'ansible_search_path' from source: unknown 19665 1727204153.59480: variable 'ansible_search_path' from source: unknown 19665 1727204153.59523: we have included files to process 19665 1727204153.59525: generating all_blocks data 19665 1727204153.59526: done generating all_blocks data 19665 1727204153.59527: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19665 1727204153.59528: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19665 1727204153.59530: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 19665 1727204153.59834: done processing included file 19665 1727204153.59838: iterating over new_blocks loaded from include file 19665 1727204153.59840: in VariableManager get_vars() 19665 1727204153.59853: done with get_vars() 19665 1727204153.59855: filtering new block on tags 19665 1727204153.59869: done filtering new block on tags 19665 1727204153.59871: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node3 19665 1727204153.59874: extending task lists for all hosts with included blocks 19665 1727204153.59950: done extending task lists 19665 1727204153.59952: done processing included files 19665 1727204153.59953: results queue empty 19665 1727204153.59953: checking for any_errors_fatal 19665 1727204153.59957: done checking for any_errors_fatal 19665 1727204153.59958: checking for max_fail_percentage 19665 1727204153.59959: done checking for max_fail_percentage 19665 1727204153.59960: checking to see if all hosts have failed and the running result is not ok 19665 1727204153.59962: done checking to see if all hosts have failed 19665 1727204153.59967: getting the remaining hosts for this loop 19665 1727204153.59968: done getting the remaining hosts for this loop 19665 1727204153.59984: getting the next task for host managed-node3 19665 1727204153.59988: done getting next task for host managed-node3 19665 1727204153.59990: ^ task is: TASK: Gather current interface info 19665 1727204153.59993: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204153.59995: getting variables 19665 1727204153.59996: in VariableManager get_vars() 19665 1727204153.60003: Calling all_inventory to load vars for managed-node3 19665 1727204153.60005: Calling groups_inventory to load vars for managed-node3 19665 1727204153.60007: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204153.60012: Calling all_plugins_play to load vars for managed-node3 19665 1727204153.60015: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204153.60018: Calling groups_plugins_play to load vars for managed-node3 19665 1727204153.60215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204153.60641: done with get_vars() 19665 1727204153.60649: done getting variables 19665 1727204153.60697: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:55:53 -0400 (0:00:00.030) 0:00:04.473 ***** 19665 1727204153.60727: entering _queue_task() for managed-node3/command 19665 1727204153.61194: worker is 1 (out of 1 available) 19665 1727204153.61206: exiting _queue_task() for managed-node3/command 19665 1727204153.61219: done queuing things up, now waiting for results queue to drain 19665 1727204153.61221: waiting for pending results... 
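The "Gather current interface info" task queued above lists the kernel's view of network interfaces by running `ls -1` inside /sys/class/net; the module_args echoed later in this log (chdir=/sys/class/net, _raw_params="ls -1") confirm the command. A plausible sketch of the task in tasks/get_current_interfaces.yml follows; the register name is an assumption, not something the log shows:

    # Sketch of the first task in tasks/get_current_interfaces.yml; the command
    # arguments match the module_args recorded below, the register name
    # `_current_interfaces` is hypothetical.
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces
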
19665 1727204153.61372: running TaskExecutor() for managed-node3/TASK: Gather current interface info 19665 1727204153.61455: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000fd 19665 1727204153.61472: variable 'ansible_search_path' from source: unknown 19665 1727204153.61476: variable 'ansible_search_path' from source: unknown 19665 1727204153.61504: calling self._execute() 19665 1727204153.61561: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.61574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.61584: variable 'omit' from source: magic vars 19665 1727204153.61854: variable 'ansible_distribution_major_version' from source: facts 19665 1727204153.61866: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204153.61871: variable 'omit' from source: magic vars 19665 1727204153.61907: variable 'omit' from source: magic vars 19665 1727204153.61931: variable 'omit' from source: magic vars 19665 1727204153.61968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204153.61995: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204153.62011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204153.62028: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204153.62036: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204153.62061: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204153.62067: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.62069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.62137: Set connection var ansible_connection to ssh 19665 1727204153.62146: Set connection var ansible_shell_type to sh 19665 1727204153.62151: Set connection var ansible_timeout to 10 19665 1727204153.62156: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204153.62163: Set connection var ansible_shell_executable to /bin/sh 19665 1727204153.62171: Set connection var ansible_pipelining to False 19665 1727204153.62188: variable 'ansible_shell_executable' from source: unknown 19665 1727204153.62191: variable 'ansible_connection' from source: unknown 19665 1727204153.62194: variable 'ansible_module_compression' from source: unknown 19665 1727204153.62196: variable 'ansible_shell_type' from source: unknown 19665 1727204153.62200: variable 'ansible_shell_executable' from source: unknown 19665 1727204153.62203: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204153.62206: variable 'ansible_pipelining' from source: unknown 19665 1727204153.62208: variable 'ansible_timeout' from source: unknown 19665 1727204153.62210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204153.62308: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204153.62315: variable 'omit' from source: magic vars 19665 
1727204153.62322: starting attempt loop 19665 1727204153.62325: running the handler 19665 1727204153.62337: _low_level_execute_command(): starting 19665 1727204153.62347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204153.63065: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.63084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204153.63109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204153.63120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.63209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.65534: stdout chunk (state=3): >>>/root <<< 19665 1727204153.65684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204153.65748: stderr chunk (state=3): >>><<< 19665 1727204153.65751: stdout chunk (state=3): >>><<< 19665 1727204153.65775: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204153.65788: _low_level_execute_command(): starting 19665 1727204153.65798: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355 `" && echo ansible-tmp-1727204153.6577666-19952-269539612756355="` echo /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355 `" ) && sleep 0' 19665 
1727204153.66271: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.66285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.66303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204153.66316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204153.66338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.66378: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204153.66391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.66448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.69153: stdout chunk (state=3): >>>ansible-tmp-1727204153.6577666-19952-269539612756355=/root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355 <<< 19665 1727204153.69343: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204153.69380: stderr chunk (state=3): >>><<< 19665 1727204153.69384: stdout chunk (state=3): >>><<< 19665 1727204153.69400: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204153.6577666-19952-269539612756355=/root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204153.69427: variable 'ansible_module_compression' from source: unknown 19665 1727204153.69478: ANSIBALLZ: Using generic lock for ansible.legacy.command 19665 1727204153.69481: ANSIBALLZ: Acquiring lock 19665 1727204153.69484: ANSIBALLZ: Lock acquired: 140619596462752 19665 1727204153.69487: 
ANSIBALLZ: Creating module 19665 1727204153.79504: ANSIBALLZ: Writing module into payload 19665 1727204153.79592: ANSIBALLZ: Writing module 19665 1727204153.79608: ANSIBALLZ: Renaming module 19665 1727204153.79611: ANSIBALLZ: Done creating module 19665 1727204153.79629: variable 'ansible_facts' from source: unknown 19665 1727204153.79680: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/AnsiballZ_command.py 19665 1727204153.79790: Sending initial data 19665 1727204153.79800: Sent initial data (156 bytes) 19665 1727204153.80495: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.80501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.80538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.80542: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.80545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.80606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204153.80609: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204153.80612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.80672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.83159: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19665 1727204153.83163: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204153.83204: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 19665 1727204153.83217: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 19665 1727204153.83227: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 19665 1727204153.83296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmprb35y417 /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/AnsiballZ_command.py <<< 19665 1727204153.83335: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204153.84478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 
1727204153.84744: stderr chunk (state=3): >>><<< 19665 1727204153.84748: stdout chunk (state=3): >>><<< 19665 1727204153.84750: done transferring module to remote 19665 1727204153.84752: _low_level_execute_command(): starting 19665 1727204153.84755: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/ /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/AnsiballZ_command.py && sleep 0' 19665 1727204153.85331: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204153.85345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.85359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.85379: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.85423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204153.85435: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204153.85449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.85468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204153.85480: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204153.85491: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204153.85505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.85517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204153.85532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204153.85544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204153.85559: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204153.85575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204153.85651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204153.85681: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204153.85696: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.85781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204153.88296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204153.88334: stderr chunk (state=3): >>><<< 19665 1727204153.88338: stdout chunk (state=3): >>><<< 19665 1727204153.88355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204153.88367: _low_level_execute_command(): starting 19665 1727204153.88370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/AnsiballZ_command.py && sleep 0' 19665 1727204153.88882: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204153.88897: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204153.89071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204153.89127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.09081: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:54.086914", "end": "2024-09-24 14:55:54.089846", "delta": "0:00:00.002932", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204154.10377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204154.10400: stdout chunk (state=3): >>><<< 19665 1727204154.10403: stderr chunk (state=3): >>><<< 19665 1727204154.10556: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:55:54.086914", "end": "2024-09-24 14:55:54.089846", "delta": "0:00:00.002932", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204154.10561: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204154.10566: _low_level_execute_command(): starting 19665 1727204154.10569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204153.6577666-19952-269539612756355/ > /dev/null 2>&1 && sleep 0' 19665 1727204154.11516: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204154.11543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.11561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.11586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.11632: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.11653: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204154.11670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.11688: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204154.11704: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204154.11715: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204154.11726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.11741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.11763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.11777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.11788: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204154.11801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.11887: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.11910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.11926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.12007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204154.14088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.14191: stderr chunk (state=3): >>><<< 19665 1727204154.14194: stdout chunk (state=3): >>><<< 19665 1727204154.14470: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204154.14474: handler run complete 19665 1727204154.14476: Evaluated conditional (False): False 19665 1727204154.14478: attempt loop complete, returning result 19665 1727204154.14480: _execute() done 19665 1727204154.14482: dumping result to json 19665 1727204154.14484: done dumping result, returning 19665 1727204154.14486: done running TaskExecutor() for managed-node3/TASK: Gather current interface info [0affcd87-79f5-0dcc-3ea6-0000000000fd] 19665 1727204154.14488: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000fd 19665 1727204154.14575: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000fd 19665 1727204154.14578: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.002932", "end": "2024-09-24 14:55:54.089846", "rc": 0, "start": "2024-09-24 14:55:54.086914" } STDOUT: bonding_masters eth0 lo 19665 1727204154.14647: no more pending results, returning what we have 19665 1727204154.14650: results queue empty 19665 1727204154.14651: checking for any_errors_fatal 19665 1727204154.14653: done checking for any_errors_fatal 19665 1727204154.14653: checking for max_fail_percentage 19665 1727204154.14655: done checking for max_fail_percentage 19665 1727204154.14656: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.14657: done checking to see if all hosts have failed 19665 1727204154.14657: getting the remaining hosts for this loop 19665 1727204154.14659: done getting the remaining hosts for this loop 19665 1727204154.14663: getting the next task for host managed-node3 19665 1727204154.14672: done getting next task for host managed-node3 19665 1727204154.14675: ^ task is: TASK: Set current_interfaces 19665 1727204154.14678: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204154.14682: getting variables 19665 1727204154.14683: in VariableManager get_vars() 19665 1727204154.14710: Calling all_inventory to load vars for managed-node3 19665 1727204154.14713: Calling groups_inventory to load vars for managed-node3 19665 1727204154.14716: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.14727: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.14729: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.14732: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.14987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.15503: done with get_vars() 19665 1727204154.15516: done getting variables 19665 1727204154.15694: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.549) 0:00:05.023 ***** 19665 1727204154.15727: entering _queue_task() for managed-node3/set_fact 19665 1727204154.16154: worker is 1 (out of 1 available) 19665 1727204154.16201: exiting _queue_task() for managed-node3/set_fact 19665 1727204154.16212: done queuing things up, now waiting for results queue to drain 19665 1727204154.16214: waiting for pending results... 
19665 1727204154.16992: running TaskExecutor() for managed-node3/TASK: Set current_interfaces 19665 1727204154.17120: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000fe 19665 1727204154.17142: variable 'ansible_search_path' from source: unknown 19665 1727204154.17196: variable 'ansible_search_path' from source: unknown 19665 1727204154.17248: calling self._execute() 19665 1727204154.17348: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.17360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.17376: variable 'omit' from source: magic vars 19665 1727204154.17845: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.17897: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.17909: variable 'omit' from source: magic vars 19665 1727204154.17962: variable 'omit' from source: magic vars 19665 1727204154.18048: variable '_current_interfaces' from source: set_fact 19665 1727204154.18104: variable 'omit' from source: magic vars 19665 1727204154.18141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204154.18230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204154.18245: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204154.18258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.18269: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.18294: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204154.18297: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.18299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.18368: Set connection var ansible_connection to ssh 19665 1727204154.18374: Set connection var ansible_shell_type to sh 19665 1727204154.18379: Set connection var ansible_timeout to 10 19665 1727204154.18384: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204154.18391: Set connection var ansible_shell_executable to /bin/sh 19665 1727204154.18397: Set connection var ansible_pipelining to False 19665 1727204154.18415: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.18420: variable 'ansible_connection' from source: unknown 19665 1727204154.18422: variable 'ansible_module_compression' from source: unknown 19665 1727204154.18425: variable 'ansible_shell_type' from source: unknown 19665 1727204154.18427: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.18430: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.18432: variable 'ansible_pipelining' from source: unknown 19665 1727204154.18434: variable 'ansible_timeout' from source: unknown 19665 1727204154.18439: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.18534: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 19665 1727204154.18543: variable 'omit' from source: magic vars 19665 1727204154.18548: starting attempt loop 19665 1727204154.18552: running the handler 19665 1727204154.18560: handler run complete 19665 1727204154.18573: attempt loop complete, returning result 19665 1727204154.18576: _execute() done 19665 1727204154.18578: dumping result to json 19665 1727204154.18580: done dumping result, returning 19665 1727204154.18588: done running TaskExecutor() for managed-node3/TASK: Set current_interfaces [0affcd87-79f5-0dcc-3ea6-0000000000fe] 19665 1727204154.18592: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000fe 19665 1727204154.18663: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000fe 19665 1727204154.18668: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 19665 1727204154.18719: no more pending results, returning what we have 19665 1727204154.18722: results queue empty 19665 1727204154.18723: checking for any_errors_fatal 19665 1727204154.18731: done checking for any_errors_fatal 19665 1727204154.18732: checking for max_fail_percentage 19665 1727204154.18733: done checking for max_fail_percentage 19665 1727204154.18734: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.18735: done checking to see if all hosts have failed 19665 1727204154.18736: getting the remaining hosts for this loop 19665 1727204154.18738: done getting the remaining hosts for this loop 19665 1727204154.18742: getting the next task for host managed-node3 19665 1727204154.18749: done getting next task for host managed-node3 19665 1727204154.18752: ^ task is: TASK: Show current_interfaces 19665 1727204154.18755: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.18760: getting variables 19665 1727204154.18761: in VariableManager get_vars() 19665 1727204154.18792: Calling all_inventory to load vars for managed-node3 19665 1727204154.18794: Calling groups_inventory to load vars for managed-node3 19665 1727204154.18798: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.18808: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.18810: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.18813: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.18985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.19139: done with get_vars() 19665 1727204154.19148: done getting variables 19665 1727204154.19238: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.035) 0:00:05.059 ***** 19665 1727204154.19268: entering _queue_task() for managed-node3/debug 19665 1727204154.19269: Creating lock for debug 19665 1727204154.19530: worker is 1 (out of 1 available) 19665 1727204154.19545: exiting _queue_task() for managed-node3/debug 19665 1727204154.19560: done queuing things up, now waiting for results queue to drain 19665 1727204154.19562: waiting for pending results... 
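The ansible_facts shown for the "Set current_interfaces" result above imply that the task at get_current_interfaces.yml:9 is essentially a set_fact that turns the registered command output into a list. A minimal sketch, with the exact Jinja2 expression assumed:

    - name: Set current_interfaces
      set_fact:
        current_interfaces: "{{ _current_interfaces.stdout_lines }}"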
19665 1727204154.19861: running TaskExecutor() for managed-node3/TASK: Show current_interfaces 19665 1727204154.19987: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000000ef 19665 1727204154.20007: variable 'ansible_search_path' from source: unknown 19665 1727204154.20016: variable 'ansible_search_path' from source: unknown 19665 1727204154.20061: calling self._execute() 19665 1727204154.20224: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.20239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.20258: variable 'omit' from source: magic vars 19665 1727204154.21070: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.21094: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.21106: variable 'omit' from source: magic vars 19665 1727204154.21201: variable 'omit' from source: magic vars 19665 1727204154.21344: variable 'current_interfaces' from source: set_fact 19665 1727204154.21377: variable 'omit' from source: magic vars 19665 1727204154.21428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204154.21476: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204154.21499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204154.21531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.21573: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.21603: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204154.21607: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.21609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.21703: Set connection var ansible_connection to ssh 19665 1727204154.21709: Set connection var ansible_shell_type to sh 19665 1727204154.21715: Set connection var ansible_timeout to 10 19665 1727204154.21720: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204154.21728: Set connection var ansible_shell_executable to /bin/sh 19665 1727204154.21739: Set connection var ansible_pipelining to False 19665 1727204154.21770: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.21774: variable 'ansible_connection' from source: unknown 19665 1727204154.21776: variable 'ansible_module_compression' from source: unknown 19665 1727204154.21778: variable 'ansible_shell_type' from source: unknown 19665 1727204154.21780: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.21782: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.21784: variable 'ansible_pipelining' from source: unknown 19665 1727204154.21789: variable 'ansible_timeout' from source: unknown 19665 1727204154.21790: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.21896: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 
19665 1727204154.21904: variable 'omit' from source: magic vars 19665 1727204154.21909: starting attempt loop 19665 1727204154.21912: running the handler 19665 1727204154.21948: handler run complete 19665 1727204154.21960: attempt loop complete, returning result 19665 1727204154.21962: _execute() done 19665 1727204154.21967: dumping result to json 19665 1727204154.21970: done dumping result, returning 19665 1727204154.21976: done running TaskExecutor() for managed-node3/TASK: Show current_interfaces [0affcd87-79f5-0dcc-3ea6-0000000000ef] 19665 1727204154.21985: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000ef 19665 1727204154.22068: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000000ef 19665 1727204154.22071: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 19665 1727204154.22116: no more pending results, returning what we have 19665 1727204154.22119: results queue empty 19665 1727204154.22120: checking for any_errors_fatal 19665 1727204154.22125: done checking for any_errors_fatal 19665 1727204154.22125: checking for max_fail_percentage 19665 1727204154.22127: done checking for max_fail_percentage 19665 1727204154.22127: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.22129: done checking to see if all hosts have failed 19665 1727204154.22129: getting the remaining hosts for this loop 19665 1727204154.22131: done getting the remaining hosts for this loop 19665 1727204154.22135: getting the next task for host managed-node3 19665 1727204154.22143: done getting next task for host managed-node3 19665 1727204154.22146: ^ task is: TASK: Include the task 'assert_device_absent.yml' 19665 1727204154.22148: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204154.22151: getting variables 19665 1727204154.22153: in VariableManager get_vars() 19665 1727204154.22180: Calling all_inventory to load vars for managed-node3 19665 1727204154.22182: Calling groups_inventory to load vars for managed-node3 19665 1727204154.22185: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.22195: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.22197: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.22200: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.22326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.22446: done with get_vars() 19665 1727204154.22453: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:14 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.032) 0:00:05.091 ***** 19665 1727204154.22519: entering _queue_task() for managed-node3/include_tasks 19665 1727204154.22707: worker is 1 (out of 1 available) 19665 1727204154.22720: exiting _queue_task() for managed-node3/include_tasks 19665 1727204154.22733: done queuing things up, now waiting for results queue to drain 19665 1727204154.22734: waiting for pending results... 
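The debug MSG above ("current_interfaces: ['bonding_masters', 'eth0', 'lo']") pins down the "Show current_interfaces" task at show_interfaces.yml:5 fairly precisely; a minimal sketch:

    - name: Show current_interfaces
      debug:
        msg: "current_interfaces: {{ current_interfaces }}"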
19665 1727204154.22891: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' 19665 1727204154.22992: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000000d 19665 1727204154.23023: variable 'ansible_search_path' from source: unknown 19665 1727204154.23075: calling self._execute() 19665 1727204154.23168: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.23179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.23193: variable 'omit' from source: magic vars 19665 1727204154.23605: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.23624: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.23642: _execute() done 19665 1727204154.23653: dumping result to json 19665 1727204154.23661: done dumping result, returning 19665 1727204154.23677: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_device_absent.yml' [0affcd87-79f5-0dcc-3ea6-00000000000d] 19665 1727204154.23687: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000d 19665 1727204154.23808: no more pending results, returning what we have 19665 1727204154.23812: in VariableManager get_vars() 19665 1727204154.23844: Calling all_inventory to load vars for managed-node3 19665 1727204154.23847: Calling groups_inventory to load vars for managed-node3 19665 1727204154.23851: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.23866: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.23869: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.23872: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.24132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.24371: done with get_vars() 19665 1727204154.24381: variable 'ansible_search_path' from source: unknown 19665 1727204154.24672: we have included files to process 19665 1727204154.24682: generating all_blocks data 19665 1727204154.24685: done generating all_blocks data 19665 1727204154.24691: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204154.24692: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204154.24695: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204154.25040: in VariableManager get_vars() 19665 1727204154.25057: done with get_vars() 19665 1727204154.25218: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000000d 19665 1727204154.25221: WORKER PROCESS EXITING 19665 1727204154.25333: done processing included file 19665 1727204154.25336: iterating over new_blocks loaded from include file 19665 1727204154.25337: in VariableManager get_vars() 19665 1727204154.25349: done with get_vars() 19665 1727204154.25350: filtering new block on tags 19665 1727204154.25368: done filtering new block on tags 19665 1727204154.25370: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 19665 1727204154.25375: extending task lists for all 
hosts with included blocks 19665 1727204154.25537: done extending task lists 19665 1727204154.25538: done processing included files 19665 1727204154.25539: results queue empty 19665 1727204154.25540: checking for any_errors_fatal 19665 1727204154.25543: done checking for any_errors_fatal 19665 1727204154.25544: checking for max_fail_percentage 19665 1727204154.25545: done checking for max_fail_percentage 19665 1727204154.25546: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.25547: done checking to see if all hosts have failed 19665 1727204154.25548: getting the remaining hosts for this loop 19665 1727204154.25549: done getting the remaining hosts for this loop 19665 1727204154.25551: getting the next task for host managed-node3 19665 1727204154.25555: done getting next task for host managed-node3 19665 1727204154.25557: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19665 1727204154.25560: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204154.25562: getting variables 19665 1727204154.25563: in VariableManager get_vars() 19665 1727204154.25573: Calling all_inventory to load vars for managed-node3 19665 1727204154.25575: Calling groups_inventory to load vars for managed-node3 19665 1727204154.25577: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.25582: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.25584: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.25587: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.25749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.25994: done with get_vars() 19665 1727204154.26003: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.035) 0:00:05.127 ***** 19665 1727204154.26088: entering _queue_task() for managed-node3/include_tasks 19665 1727204154.26357: worker is 1 (out of 1 available) 19665 1727204154.26371: exiting _queue_task() for managed-node3/include_tasks 19665 1727204154.26384: done queuing things up, now waiting for results queue to drain 19665 1727204154.26386: waiting for pending results... 
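The task path assert_device_absent.yml:3 and the include_tasks queueing above suggest that the first task of the included file looks like the sketch below; the rest of that file appears further down in this log as the stat and assert tasks:

    # assert_device_absent.yml, line 3 (sketch)
    - name: Include the task 'get_interface_stat.yml'
      include_tasks: get_interface_stat.yml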
19665 1727204154.26739: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 19665 1727204154.26891: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000119 19665 1727204154.26895: variable 'ansible_search_path' from source: unknown 19665 1727204154.26898: variable 'ansible_search_path' from source: unknown 19665 1727204154.26923: calling self._execute() 19665 1727204154.27071: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.27075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.27082: variable 'omit' from source: magic vars 19665 1727204154.27351: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.27361: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.27368: _execute() done 19665 1727204154.27373: dumping result to json 19665 1727204154.27379: done dumping result, returning 19665 1727204154.27385: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0dcc-3ea6-000000000119] 19665 1727204154.27391: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000119 19665 1727204154.27475: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000119 19665 1727204154.27478: WORKER PROCESS EXITING 19665 1727204154.27506: no more pending results, returning what we have 19665 1727204154.27511: in VariableManager get_vars() 19665 1727204154.27600: Calling all_inventory to load vars for managed-node3 19665 1727204154.27602: Calling groups_inventory to load vars for managed-node3 19665 1727204154.27605: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.27615: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.27618: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.27620: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.27772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.27950: done with get_vars() 19665 1727204154.27957: variable 'ansible_search_path' from source: unknown 19665 1727204154.27959: variable 'ansible_search_path' from source: unknown 19665 1727204154.27995: we have included files to process 19665 1727204154.27997: generating all_blocks data 19665 1727204154.27998: done generating all_blocks data 19665 1727204154.28000: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204154.28001: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204154.28003: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204154.28228: done processing included file 19665 1727204154.28230: iterating over new_blocks loaded from include file 19665 1727204154.28231: in VariableManager get_vars() 19665 1727204154.28243: done with get_vars() 19665 1727204154.28244: filtering new block on tags 19665 1727204154.28258: done filtering new block on tags 19665 1727204154.28260: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 19665 
1727204154.28267: extending task lists for all hosts with included blocks 19665 1727204154.28361: done extending task lists 19665 1727204154.28363: done processing included files 19665 1727204154.28365: results queue empty 19665 1727204154.28366: checking for any_errors_fatal 19665 1727204154.28369: done checking for any_errors_fatal 19665 1727204154.28369: checking for max_fail_percentage 19665 1727204154.28370: done checking for max_fail_percentage 19665 1727204154.28371: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.28372: done checking to see if all hosts have failed 19665 1727204154.28373: getting the remaining hosts for this loop 19665 1727204154.28374: done getting the remaining hosts for this loop 19665 1727204154.28376: getting the next task for host managed-node3 19665 1727204154.28380: done getting next task for host managed-node3 19665 1727204154.28382: ^ task is: TASK: Get stat for interface {{ interface }} 19665 1727204154.28385: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204154.28387: getting variables 19665 1727204154.28388: in VariableManager get_vars() 19665 1727204154.28396: Calling all_inventory to load vars for managed-node3 19665 1727204154.28398: Calling groups_inventory to load vars for managed-node3 19665 1727204154.28400: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.28405: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.28408: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.28410: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.28607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.28837: done with get_vars() 19665 1727204154.28846: done getting variables 19665 1727204154.29006: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.029) 0:00:05.156 ***** 19665 1727204154.29036: entering _queue_task() for managed-node3/stat 19665 1727204154.29308: worker is 1 (out of 1 available) 19665 1727204154.29325: exiting _queue_task() for managed-node3/stat 19665 1727204154.29338: done queuing things up, now waiting for results queue to drain 19665 1727204154.29339: waiting for pending results... 
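The "Get stat for interface LSR-TST-br31" task at get_interface_stat.yml:3 checks whether the interface directory exists under /sys/class/net. Its module_args appear verbatim in the stat result further below; only the register name in this sketch is an assumption:

    - name: "Get stat for interface {{ interface }}"
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat   # assumed name, consumed by the assert task that follows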
19665 1727204154.29591: running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 19665 1727204154.29723: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000133 19665 1727204154.29740: variable 'ansible_search_path' from source: unknown 19665 1727204154.29748: variable 'ansible_search_path' from source: unknown 19665 1727204154.29798: calling self._execute() 19665 1727204154.29867: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.29871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.29879: variable 'omit' from source: magic vars 19665 1727204154.30276: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.30295: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.30306: variable 'omit' from source: magic vars 19665 1727204154.30358: variable 'omit' from source: magic vars 19665 1727204154.30471: variable 'interface' from source: set_fact 19665 1727204154.30492: variable 'omit' from source: magic vars 19665 1727204154.30537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204154.30580: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204154.30606: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204154.30628: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.30643: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.30683: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204154.30691: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.30699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.30796: Set connection var ansible_connection to ssh 19665 1727204154.30808: Set connection var ansible_shell_type to sh 19665 1727204154.30820: Set connection var ansible_timeout to 10 19665 1727204154.30829: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204154.30842: Set connection var ansible_shell_executable to /bin/sh 19665 1727204154.30854: Set connection var ansible_pipelining to False 19665 1727204154.30908: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.30920: variable 'ansible_connection' from source: unknown 19665 1727204154.30934: variable 'ansible_module_compression' from source: unknown 19665 1727204154.30942: variable 'ansible_shell_type' from source: unknown 19665 1727204154.30948: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.30955: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.30981: variable 'ansible_pipelining' from source: unknown 19665 1727204154.30994: variable 'ansible_timeout' from source: unknown 19665 1727204154.31016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.31293: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204154.31301: variable 
'omit' from source: magic vars 19665 1727204154.31308: starting attempt loop 19665 1727204154.31311: running the handler 19665 1727204154.31320: _low_level_execute_command(): starting 19665 1727204154.31326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204154.31844: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.31867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.31888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.31903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.31941: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.31959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.32015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.34286: stdout chunk (state=3): >>>/root <<< 19665 1727204154.34448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.34501: stderr chunk (state=3): >>><<< 19665 1727204154.34505: stdout chunk (state=3): >>><<< 19665 1727204154.34528: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204154.34539: _low_level_execute_command(): starting 19665 1727204154.34551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129 `" && echo 
ansible-tmp-1727204154.3452804-19991-98638697133129="` echo /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129 `" ) && sleep 0' 19665 1727204154.35001: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.35014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.35040: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.35059: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.35102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.35113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.35170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.37872: stdout chunk (state=3): >>>ansible-tmp-1727204154.3452804-19991-98638697133129=/root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129 <<< 19665 1727204154.38044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.38100: stderr chunk (state=3): >>><<< 19665 1727204154.38103: stdout chunk (state=3): >>><<< 19665 1727204154.38118: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204154.3452804-19991-98638697133129=/root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204154.38160: variable 'ansible_module_compression' from source: unknown 19665 1727204154.38211: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19665 1727204154.38241: variable 'ansible_facts' from source: unknown 19665 1727204154.38310: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/AnsiballZ_stat.py 19665 1727204154.38424: Sending initial data 19665 1727204154.38432: Sent initial data (152 bytes) 19665 1727204154.39126: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.39129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.39168: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.39171: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.39175: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.39229: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.39233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.39239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.39289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.41676: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204154.41719: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204154.41766: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5pdffu79 /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/AnsiballZ_stat.py <<< 19665 1727204154.41806: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204154.42646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.42760: stderr chunk (state=3): >>><<< 19665 1727204154.42763: stdout chunk (state=3): >>><<< 19665 1727204154.42783: done transferring module to remote 19665 1727204154.42795: _low_level_execute_command(): starting 19665 1727204154.42798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/ /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/AnsiballZ_stat.py && sleep 0' 19665 1727204154.43269: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.43277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.43308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.43320: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.43375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.43389: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.43448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.45910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.45967: stderr chunk (state=3): >>><<< 19665 1727204154.45970: stdout chunk (state=3): >>><<< 19665 1727204154.45987: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 19665 1727204154.45994: _low_level_execute_command(): starting 19665 1727204154.45996: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/AnsiballZ_stat.py && sleep 0' 19665 1727204154.46469: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.46476: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.46504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204154.46523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.46557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.46605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.46617: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.46682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 19665 1727204154.63523: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19665 1727204154.64575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204154.64616: stderr chunk (state=3): >>><<< 19665 1727204154.64620: stdout chunk (state=3): >>><<< 19665 1727204154.64640: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204154.64673: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204154.64683: _low_level_execute_command(): starting 19665 1727204154.64688: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204154.3452804-19991-98638697133129/ > /dev/null 2>&1 && sleep 0' 19665 1727204154.65958: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204154.66586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.66597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.66611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.66658: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.66667: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204154.66677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.66691: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204154.66698: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204154.66704: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204154.66711: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.66721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.66732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.66742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.66749: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204154.66758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.66831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.66852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.66867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.66935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204154.68792: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.68796: stdout chunk (state=3): >>><<< 19665 1727204154.68802: stderr chunk (state=3): >>><<< 19665 1727204154.68822: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204154.68830: handler run complete 19665 1727204154.68854: attempt loop complete, returning result 19665 1727204154.68857: _execute() done 19665 1727204154.68860: dumping result to json 19665 1727204154.68862: done dumping result, returning 19665 1727204154.68872: done running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000133] 19665 1727204154.68877: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000133 19665 1727204154.68980: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000133 19665 1727204154.68982: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 19665 1727204154.69039: no more pending results, returning what we have 19665 1727204154.69043: results queue empty 19665 1727204154.69044: checking for any_errors_fatal 19665 1727204154.69045: done checking for any_errors_fatal 19665 1727204154.69046: checking for max_fail_percentage 19665 1727204154.69048: done checking for max_fail_percentage 19665 1727204154.69049: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.69050: done checking to see if all hosts have failed 19665 1727204154.69050: getting the remaining hosts for this loop 19665 1727204154.69052: done getting the remaining hosts for this loop 19665 1727204154.69056: getting the next task for host managed-node3 19665 1727204154.69067: done getting next task for host managed-node3 19665 1727204154.69070: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 19665 1727204154.69073: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.69077: getting variables 19665 1727204154.69078: in VariableManager get_vars() 19665 1727204154.69108: Calling all_inventory to load vars for managed-node3 19665 1727204154.69111: Calling groups_inventory to load vars for managed-node3 19665 1727204154.69114: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.69125: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.69127: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.69129: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.69307: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.69514: done with get_vars() 19665 1727204154.69525: done getting variables 19665 1727204154.69627: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 19665 1727204154.70154: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.411) 0:00:05.568 ***** 19665 1727204154.70187: entering _queue_task() for managed-node3/assert 19665 1727204154.70189: Creating lock for assert 19665 1727204154.71131: worker is 1 (out of 1 available) 19665 1727204154.71146: exiting _queue_task() for managed-node3/assert 19665 1727204154.71161: done queuing things up, now waiting for results queue to drain 19665 1727204154.71162: waiting for pending results... 
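The assert queued here checks the stat result gathered above; judging from the task title and the conditional 'not interface_stat.stat.exists' reported a few records further down, a minimal sketch of the task is the following (the fail_msg text is illustrative only; the log reports just "All assertions passed"):

    - name: Assert that the interface is absent - '{{ interface }}'
      assert:
        that:
          - not interface_stat.stat.exists
        fail_msg: "Interface {{ interface }} is still present on the managed node"   # assumed message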
19665 1727204154.71362: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' 19665 1727204154.71460: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000011a 19665 1727204154.71480: variable 'ansible_search_path' from source: unknown 19665 1727204154.71487: variable 'ansible_search_path' from source: unknown 19665 1727204154.71526: calling self._execute() 19665 1727204154.72350: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.72360: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.72374: variable 'omit' from source: magic vars 19665 1727204154.72724: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.73386: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.73398: variable 'omit' from source: magic vars 19665 1727204154.73442: variable 'omit' from source: magic vars 19665 1727204154.73550: variable 'interface' from source: set_fact 19665 1727204154.73575: variable 'omit' from source: magic vars 19665 1727204154.73622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204154.73667: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204154.73693: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204154.73797: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.73813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.73852: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204154.73856: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.73859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.73958: Set connection var ansible_connection to ssh 19665 1727204154.74680: Set connection var ansible_shell_type to sh 19665 1727204154.74691: Set connection var ansible_timeout to 10 19665 1727204154.74700: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204154.74715: Set connection var ansible_shell_executable to /bin/sh 19665 1727204154.74727: Set connection var ansible_pipelining to False 19665 1727204154.74760: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.74771: variable 'ansible_connection' from source: unknown 19665 1727204154.74777: variable 'ansible_module_compression' from source: unknown 19665 1727204154.74783: variable 'ansible_shell_type' from source: unknown 19665 1727204154.74789: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.74795: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.74801: variable 'ansible_pipelining' from source: unknown 19665 1727204154.74807: variable 'ansible_timeout' from source: unknown 19665 1727204154.74814: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.74955: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 19665 1727204154.74974: variable 'omit' from source: magic vars 19665 1727204154.74984: starting attempt loop 19665 1727204154.74991: running the handler 19665 1727204154.75153: variable 'interface_stat' from source: set_fact 19665 1727204154.75170: Evaluated conditional (not interface_stat.stat.exists): True 19665 1727204154.75179: handler run complete 19665 1727204154.75197: attempt loop complete, returning result 19665 1727204154.75204: _execute() done 19665 1727204154.75210: dumping result to json 19665 1727204154.75217: done dumping result, returning 19665 1727204154.75228: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-00000000011a] 19665 1727204154.75239: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000011a ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204154.75391: no more pending results, returning what we have 19665 1727204154.75394: results queue empty 19665 1727204154.75396: checking for any_errors_fatal 19665 1727204154.75403: done checking for any_errors_fatal 19665 1727204154.75404: checking for max_fail_percentage 19665 1727204154.75405: done checking for max_fail_percentage 19665 1727204154.75406: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.75407: done checking to see if all hosts have failed 19665 1727204154.75408: getting the remaining hosts for this loop 19665 1727204154.75409: done getting the remaining hosts for this loop 19665 1727204154.75413: getting the next task for host managed-node3 19665 1727204154.75421: done getting next task for host managed-node3 19665 1727204154.75424: ^ task is: TASK: meta (flush_handlers) 19665 1727204154.75425: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.75429: getting variables 19665 1727204154.75430: in VariableManager get_vars() 19665 1727204154.75458: Calling all_inventory to load vars for managed-node3 19665 1727204154.75461: Calling groups_inventory to load vars for managed-node3 19665 1727204154.75466: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.75478: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.75481: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.75484: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.75685: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000011a 19665 1727204154.75689: WORKER PROCESS EXITING 19665 1727204154.75711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.75900: done with get_vars() 19665 1727204154.75912: done getting variables 19665 1727204154.75981: in VariableManager get_vars() 19665 1727204154.75990: Calling all_inventory to load vars for managed-node3 19665 1727204154.75993: Calling groups_inventory to load vars for managed-node3 19665 1727204154.75995: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.76000: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.76002: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.76005: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.76145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.76322: done with get_vars() 19665 1727204154.76341: done queuing things up, now waiting for results queue to drain 19665 1727204154.76344: results queue empty 19665 1727204154.76344: checking for any_errors_fatal 19665 1727204154.76347: done checking for any_errors_fatal 19665 1727204154.76348: checking for max_fail_percentage 19665 1727204154.76349: done checking for max_fail_percentage 19665 1727204154.76349: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.76350: done checking to see if all hosts have failed 19665 1727204154.76356: getting the remaining hosts for this loop 19665 1727204154.76357: done getting the remaining hosts for this loop 19665 1727204154.76359: getting the next task for host managed-node3 19665 1727204154.76363: done getting next task for host managed-node3 19665 1727204154.77267: ^ task is: TASK: meta (flush_handlers) 19665 1727204154.77270: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.77273: getting variables 19665 1727204154.77274: in VariableManager get_vars() 19665 1727204154.77283: Calling all_inventory to load vars for managed-node3 19665 1727204154.77285: Calling groups_inventory to load vars for managed-node3 19665 1727204154.77288: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.77293: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.77295: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.77297: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.77431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.77622: done with get_vars() 19665 1727204154.77631: done getting variables 19665 1727204154.77682: in VariableManager get_vars() 19665 1727204154.77691: Calling all_inventory to load vars for managed-node3 19665 1727204154.77693: Calling groups_inventory to load vars for managed-node3 19665 1727204154.77695: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.77699: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.77701: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.77703: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.77831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.78539: done with get_vars() 19665 1727204154.78553: done queuing things up, now waiting for results queue to drain 19665 1727204154.78554: results queue empty 19665 1727204154.78555: checking for any_errors_fatal 19665 1727204154.78556: done checking for any_errors_fatal 19665 1727204154.78557: checking for max_fail_percentage 19665 1727204154.78558: done checking for max_fail_percentage 19665 1727204154.78559: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.78560: done checking to see if all hosts have failed 19665 1727204154.78560: getting the remaining hosts for this loop 19665 1727204154.78561: done getting the remaining hosts for this loop 19665 1727204154.78566: getting the next task for host managed-node3 19665 1727204154.78570: done getting next task for host managed-node3 19665 1727204154.78570: ^ task is: None 19665 1727204154.78572: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.78573: done queuing things up, now waiting for results queue to drain 19665 1727204154.78574: results queue empty 19665 1727204154.78574: checking for any_errors_fatal 19665 1727204154.78575: done checking for any_errors_fatal 19665 1727204154.78576: checking for max_fail_percentage 19665 1727204154.78577: done checking for max_fail_percentage 19665 1727204154.78577: checking to see if all hosts have failed and the running result is not ok 19665 1727204154.78578: done checking to see if all hosts have failed 19665 1727204154.78580: getting the next task for host managed-node3 19665 1727204154.78582: done getting next task for host managed-node3 19665 1727204154.78583: ^ task is: None 19665 1727204154.78584: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204154.78628: in VariableManager get_vars() 19665 1727204154.78651: done with get_vars() 19665 1727204154.78656: in VariableManager get_vars() 19665 1727204154.78670: done with get_vars() 19665 1727204154.78674: variable 'omit' from source: magic vars 19665 1727204154.78703: in VariableManager get_vars() 19665 1727204154.78715: done with get_vars() 19665 1727204154.78738: variable 'omit' from source: magic vars PLAY [Add test bridge] ********************************************************* 19665 1727204154.79387: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204154.79412: getting the remaining hosts for this loop 19665 1727204154.79413: done getting the remaining hosts for this loop 19665 1727204154.79416: getting the next task for host managed-node3 19665 1727204154.79419: done getting next task for host managed-node3 19665 1727204154.79421: ^ task is: TASK: Gathering Facts 19665 1727204154.79422: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204154.79424: getting variables 19665 1727204154.79425: in VariableManager get_vars() 19665 1727204154.79439: Calling all_inventory to load vars for managed-node3 19665 1727204154.79442: Calling groups_inventory to load vars for managed-node3 19665 1727204154.79444: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204154.79449: Calling all_plugins_play to load vars for managed-node3 19665 1727204154.79451: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204154.79454: Calling groups_plugins_play to load vars for managed-node3 19665 1727204154.79585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204154.79752: done with get_vars() 19665 1727204154.79760: done getting variables 19665 1727204154.79804: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 Tuesday 24 September 2024 14:55:54 -0400 (0:00:00.096) 0:00:05.664 ***** 19665 1727204154.79829: entering _queue_task() for managed-node3/gather_facts 19665 1727204154.80135: worker is 1 (out of 1 available) 19665 1727204154.80150: exiting _queue_task() for managed-node3/gather_facts 19665 1727204154.80161: done queuing things up, now waiting for results queue to drain 19665 1727204154.80162: waiting for pending results... 
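The implicit "Gathering Facts" step for the "Add test bridge" play runs the setup module over the same multiplexed SSH connection; an explicit task with equivalent arguments (taken from the module invocation recorded at the end of the gathered output below) would look roughly like this sketch:

    - name: Gathering Facts
      setup:
        gather_subset:
          - all
        gather_timeout: 10
        filter: []
        fact_path: /etc/ansible/facts.d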
19665 1727204154.80446: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204154.80555: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000014c 19665 1727204154.80577: variable 'ansible_search_path' from source: unknown 19665 1727204154.80624: calling self._execute() 19665 1727204154.80713: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.80725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.80747: variable 'omit' from source: magic vars 19665 1727204154.81128: variable 'ansible_distribution_major_version' from source: facts 19665 1727204154.81151: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204154.81167: variable 'omit' from source: magic vars 19665 1727204154.81201: variable 'omit' from source: magic vars 19665 1727204154.81244: variable 'omit' from source: magic vars 19665 1727204154.81299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204154.81340: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204154.81373: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204154.81401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.81417: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204154.81456: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204154.81467: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.81476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.81588: Set connection var ansible_connection to ssh 19665 1727204154.81604: Set connection var ansible_shell_type to sh 19665 1727204154.81622: Set connection var ansible_timeout to 10 19665 1727204154.81632: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204154.81648: Set connection var ansible_shell_executable to /bin/sh 19665 1727204154.81661: Set connection var ansible_pipelining to False 19665 1727204154.81688: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.81695: variable 'ansible_connection' from source: unknown 19665 1727204154.81705: variable 'ansible_module_compression' from source: unknown 19665 1727204154.81712: variable 'ansible_shell_type' from source: unknown 19665 1727204154.81719: variable 'ansible_shell_executable' from source: unknown 19665 1727204154.81728: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204154.81734: variable 'ansible_pipelining' from source: unknown 19665 1727204154.81743: variable 'ansible_timeout' from source: unknown 19665 1727204154.81750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204154.81934: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204154.81957: variable 'omit' from source: magic vars 19665 1727204154.81967: starting attempt loop 19665 1727204154.81974: running the 
handler 19665 1727204154.81992: variable 'ansible_facts' from source: unknown 19665 1727204154.82013: _low_level_execute_command(): starting 19665 1727204154.82030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204154.82829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204154.82848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.82862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.82886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.82942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.82956: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204154.82973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.82992: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204154.83003: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204154.83017: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204154.83032: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.83049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.83063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.83078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.83089: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204154.83103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.83190: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.83213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.83232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.83335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204154.84966: stdout chunk (state=3): >>>/root <<< 19665 1727204154.85150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.85154: stdout chunk (state=3): >>><<< 19665 1727204154.85156: stderr chunk (state=3): >>><<< 19665 1727204154.85274: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204154.85278: _low_level_execute_command(): starting 19665 1727204154.85281: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042 `" && echo ansible-tmp-1727204154.8518114-20017-46781808280042="` echo /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042 `" ) && sleep 0' 19665 1727204154.86255: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.86259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.86286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.86310: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.86313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204154.86316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.86375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.86459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.86467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.86523: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204154.88369: stdout chunk (state=3): >>>ansible-tmp-1727204154.8518114-20017-46781808280042=/root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042 <<< 19665 1727204154.88590: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.88670: stderr chunk (state=3): >>><<< 19665 1727204154.88673: stdout chunk (state=3): >>><<< 19665 1727204154.88870: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204154.8518114-20017-46781808280042=/root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204154.88874: variable 'ansible_module_compression' from source: unknown 19665 1727204154.88877: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204154.88879: variable 'ansible_facts' from source: unknown 19665 1727204154.89054: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/AnsiballZ_setup.py 19665 1727204154.90203: Sending initial data 19665 1727204154.90208: Sent initial data (153 bytes) 19665 1727204154.91830: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204154.91957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.91977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.91996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.92110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.92123: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204154.92139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.92157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204154.92171: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204154.92182: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204154.92193: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.92205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.92219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.92231: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.92244: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204154.92257: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.92335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.92360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.92378: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204154.92452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204154.94232: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" 
revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204154.94277: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204154.94293: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5fr4dk1n /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/AnsiballZ_setup.py <<< 19665 1727204154.94324: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204154.97429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204154.97562: stderr chunk (state=3): >>><<< 19665 1727204154.97568: stdout chunk (state=3): >>><<< 19665 1727204154.97571: done transferring module to remote 19665 1727204154.97573: _low_level_execute_command(): starting 19665 1727204154.97576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/ /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/AnsiballZ_setup.py && sleep 0' 19665 1727204154.98857: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204154.98874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.98888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.98910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.99019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.99034: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204154.99051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.99075: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204154.99088: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204154.99098: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204154.99112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204154.99125: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204154.99142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204154.99154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204154.99167: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204154.99183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204154.99334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204154.99352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204154.99377: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 19665 1727204154.99546: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204155.01772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204155.01777: stdout chunk (state=3): >>><<< 19665 1727204155.01780: stderr chunk (state=3): >>><<< 19665 1727204155.01878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204155.01882: _low_level_execute_command(): starting 19665 1727204155.01884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/AnsiballZ_setup.py && sleep 0' 19665 1727204155.03384: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.03388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.03414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.03528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.03535: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204155.03549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.03568: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204155.03878: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204155.03962: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204155.03979: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.03988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.04001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.04009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.04017: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204155.04027: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.04104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204155.04123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204155.04136: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204155.04214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204155.53836: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": 
{"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 501, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282091520, "block_size": 4096, "block_total": 65519355, "block_available": 64521995, "block_used": 997360, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2r<<< 19665 1727204155.53865: stdout chunk (state=3): >>>bT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", 
"ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", 
"tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatte<<< 19665 1727204155.53875: stdout chunk (state=3): >>>r_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "55", "epoch": "1727204155", "epoch_int": "1727204155", "date": "2024-09-24", "time": "14:55:55", "iso8601_micro": "2024-09-24T18:55:55.534338Z", "iso8601": "2024-09-24T18:55:55Z", "iso8601_basic": "20240924T145555534338", "iso8601_basic_short": "20240924T145555", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, 
"ansible_loadavg": {"1m": 0.36, "5m": 0.34, "15m": 0.17}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204155.55551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204155.55555: stdout chunk (state=3): >>><<< 19665 1727204155.55558: stderr chunk (state=3): >>><<< 19665 1727204155.55878: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2811, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 721, "free": 2811}, "nocache": {"free": 3270, "used": 262}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": 
"HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 501, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282091520, "block_size": 4096, "block_total": 65519355, "block_available": 64521995, "block_used": 997360, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", 
"rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", 
"fe80::8ff:f5ff:fed7:be93"]}, "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "55", "second": "55", "epoch": "1727204155", "epoch_int": "1727204155", "date": "2024-09-24", "time": "14:55:55", "iso8601_micro": "2024-09-24T18:55:55.534338Z", "iso8601": "2024-09-24T18:55:55Z", "iso8601_basic": "20240924T145555534338", "iso8601_basic_short": "20240924T145555", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 0.36, "5m": 0.34, "15m": 0.17}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204155.55998: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204155.56028: _low_level_execute_command(): starting 19665 1727204155.56038: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204154.8518114-20017-46781808280042/ > /dev/null 2>&1 && sleep 0' 19665 1727204155.56717: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204155.56734: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.56757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.56784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.56832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.56847: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204155.56870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.56889: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204155.56901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204155.56912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204155.56924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.56937: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.56953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.56976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.56988: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204155.57003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.57083: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204155.57109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204155.57126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204155.57205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204155.59097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204155.59101: stdout chunk (state=3): >>><<< 19665 1727204155.59103: stderr chunk (state=3): >>><<< 19665 1727204155.59383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204155.59386: handler run complete 19665 1727204155.59389: variable 'ansible_facts' from source: unknown 19665 1727204155.59809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.60465: variable 'ansible_facts' from source: unknown 19665 1727204155.60708: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.60899: attempt loop complete, returning result 19665 1727204155.60914: _execute() done 19665 1727204155.60928: dumping result to json 19665 1727204155.60972: done dumping result, returning 19665 1727204155.61000: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-00000000014c] 19665 1727204155.61011: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000014c ok: [managed-node3] 19665 1727204155.61644: no more pending results, returning what we have 19665 1727204155.61647: results queue empty 19665 1727204155.61649: checking for any_errors_fatal 19665 1727204155.61650: done checking for any_errors_fatal 19665 1727204155.61651: checking for max_fail_percentage 19665 1727204155.61653: done checking for max_fail_percentage 19665 1727204155.61653: checking to see if all hosts have failed and the running result is not ok 19665 1727204155.61655: done checking to see if all hosts have failed 19665 1727204155.61655: getting the remaining hosts for this loop 19665 1727204155.61657: done getting the remaining hosts for this loop 19665 1727204155.61662: getting the next task for host managed-node3 19665 1727204155.61671: done getting next task for host managed-node3 19665 1727204155.61673: ^ task is: TASK: meta (flush_handlers) 19665 1727204155.61682: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204155.61687: getting variables 19665 1727204155.61689: in VariableManager get_vars() 19665 1727204155.61723: Calling all_inventory to load vars for managed-node3 19665 1727204155.61725: Calling groups_inventory to load vars for managed-node3 19665 1727204155.61728: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.61739: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.61742: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.61745: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.61927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.62613: done with get_vars() 19665 1727204155.62634: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000014c 19665 1727204155.62637: WORKER PROCESS EXITING 19665 1727204155.62639: done getting variables 19665 1727204155.62726: in VariableManager get_vars() 19665 1727204155.62740: Calling all_inventory to load vars for managed-node3 19665 1727204155.62743: Calling groups_inventory to load vars for managed-node3 19665 1727204155.62745: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.62750: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.62752: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.62760: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.63141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.63374: done with get_vars() 19665 1727204155.63387: done queuing things up, now waiting for results queue to drain 19665 1727204155.63389: results queue empty 19665 1727204155.63390: checking for any_errors_fatal 19665 1727204155.63459: done checking for any_errors_fatal 19665 1727204155.63461: checking for max_fail_percentage 19665 1727204155.63462: done checking for max_fail_percentage 19665 1727204155.63466: checking to see if all hosts have failed and the running result is not ok 19665 1727204155.63466: done checking to see if all hosts have failed 19665 1727204155.63467: getting the remaining hosts for this loop 19665 1727204155.63468: done getting the remaining hosts for this loop 19665 1727204155.63471: getting the next task for host managed-node3 19665 1727204155.63476: done getting next task for host managed-node3 19665 1727204155.63478: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204155.63480: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204155.63491: getting variables 19665 1727204155.63492: in VariableManager get_vars() 19665 1727204155.63506: Calling all_inventory to load vars for managed-node3 19665 1727204155.63509: Calling groups_inventory to load vars for managed-node3 19665 1727204155.63511: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.63515: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.63518: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.63521: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.63674: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.64911: done with get_vars() 19665 1727204155.64920: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.853) 0:00:06.517 ***** 19665 1727204155.65137: entering _queue_task() for managed-node3/include_tasks 19665 1727204155.65758: worker is 1 (out of 1 available) 19665 1727204155.65772: exiting _queue_task() for managed-node3/include_tasks 19665 1727204155.65784: done queuing things up, now waiting for results queue to drain 19665 1727204155.65786: waiting for pending results... 19665 1727204155.66754: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204155.66870: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000014 19665 1727204155.66943: variable 'ansible_search_path' from source: unknown 19665 1727204155.66952: variable 'ansible_search_path' from source: unknown 19665 1727204155.67072: calling self._execute() 19665 1727204155.67277: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.67289: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.67301: variable 'omit' from source: magic vars 19665 1727204155.68183: variable 'ansible_distribution_major_version' from source: facts 19665 1727204155.68201: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204155.68210: _execute() done 19665 1727204155.68216: dumping result to json 19665 1727204155.68222: done dumping result, returning 19665 1727204155.68236: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0dcc-3ea6-000000000014] 19665 1727204155.68347: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000014 19665 1727204155.68487: no more pending results, returning what we have 19665 1727204155.68493: in VariableManager get_vars() 19665 1727204155.68538: Calling all_inventory to load vars for managed-node3 19665 1727204155.68540: Calling groups_inventory to load vars for managed-node3 19665 1727204155.68543: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.68556: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.68559: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.68562: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.68803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.69009: done with get_vars() 19665 1727204155.69017: variable 
'ansible_search_path' from source: unknown 19665 1727204155.69018: variable 'ansible_search_path' from source: unknown 19665 1727204155.69050: we have included files to process 19665 1727204155.69051: generating all_blocks data 19665 1727204155.69053: done generating all_blocks data 19665 1727204155.69054: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204155.69055: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204155.69057: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204155.69075: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000014 19665 1727204155.69084: WORKER PROCESS EXITING 19665 1727204155.70674: done processing included file 19665 1727204155.70677: iterating over new_blocks loaded from include file 19665 1727204155.70678: in VariableManager get_vars() 19665 1727204155.70699: done with get_vars() 19665 1727204155.70701: filtering new block on tags 19665 1727204155.70718: done filtering new block on tags 19665 1727204155.70721: in VariableManager get_vars() 19665 1727204155.70741: done with get_vars() 19665 1727204155.70742: filtering new block on tags 19665 1727204155.70761: done filtering new block on tags 19665 1727204155.70765: in VariableManager get_vars() 19665 1727204155.70784: done with get_vars() 19665 1727204155.70785: filtering new block on tags 19665 1727204155.70801: done filtering new block on tags 19665 1727204155.70803: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 19665 1727204155.70808: extending task lists for all hosts with included blocks 19665 1727204155.71606: done extending task lists 19665 1727204155.71608: done processing included files 19665 1727204155.71609: results queue empty 19665 1727204155.71609: checking for any_errors_fatal 19665 1727204155.71611: done checking for any_errors_fatal 19665 1727204155.71612: checking for max_fail_percentage 19665 1727204155.71613: done checking for max_fail_percentage 19665 1727204155.71613: checking to see if all hosts have failed and the running result is not ok 19665 1727204155.71615: done checking to see if all hosts have failed 19665 1727204155.71616: getting the remaining hosts for this loop 19665 1727204155.71617: done getting the remaining hosts for this loop 19665 1727204155.71619: getting the next task for host managed-node3 19665 1727204155.71623: done getting next task for host managed-node3 19665 1727204155.71626: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204155.71628: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204155.71638: getting variables 19665 1727204155.71639: in VariableManager get_vars() 19665 1727204155.71654: Calling all_inventory to load vars for managed-node3 19665 1727204155.71656: Calling groups_inventory to load vars for managed-node3 19665 1727204155.71658: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.71663: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.71667: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.71670: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.72498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.72698: done with get_vars() 19665 1727204155.72709: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.076) 0:00:06.594 ***** 19665 1727204155.72784: entering _queue_task() for managed-node3/setup 19665 1727204155.73121: worker is 1 (out of 1 available) 19665 1727204155.73138: exiting _queue_task() for managed-node3/setup 19665 1727204155.73150: done queuing things up, now waiting for results queue to drain 19665 1727204155.73152: waiting for pending results... 19665 1727204155.73455: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204155.73589: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000018d 19665 1727204155.73613: variable 'ansible_search_path' from source: unknown 19665 1727204155.73623: variable 'ansible_search_path' from source: unknown 19665 1727204155.73703: calling self._execute() 19665 1727204155.73792: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.73803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.73819: variable 'omit' from source: magic vars 19665 1727204155.74258: variable 'ansible_distribution_major_version' from source: facts 19665 1727204155.74279: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204155.75162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204155.77957: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204155.78049: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204155.78095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204155.78142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204155.78179: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204155.78275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204155.78312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 19665 1727204155.78603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204155.78654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204155.78679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204155.78741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204155.78771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204155.78806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204155.78853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204155.78874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204155.79050: variable '__network_required_facts' from source: role '' defaults 19665 1727204155.79066: variable 'ansible_facts' from source: unknown 19665 1727204155.79163: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19665 1727204155.79173: when evaluation is False, skipping this task 19665 1727204155.79179: _execute() done 19665 1727204155.79184: dumping result to json 19665 1727204155.79190: done dumping result, returning 19665 1727204155.79200: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0dcc-3ea6-00000000018d] 19665 1727204155.79207: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000018d 19665 1727204155.79311: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000018d 19665 1727204155.79317: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204155.79381: no more pending results, returning what we have 19665 1727204155.79385: results queue empty 19665 1727204155.79387: checking for any_errors_fatal 19665 1727204155.79388: done checking for any_errors_fatal 19665 1727204155.79389: checking for max_fail_percentage 19665 1727204155.79390: done checking for max_fail_percentage 19665 1727204155.79391: checking to see if all hosts have failed and the running result is not ok 19665 1727204155.79392: done checking to see if all hosts have failed 19665 1727204155.79393: getting the remaining hosts for this loop 19665 1727204155.79395: done getting the remaining hosts for 
this loop 19665 1727204155.79399: getting the next task for host managed-node3 19665 1727204155.79408: done getting next task for host managed-node3 19665 1727204155.79413: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204155.79416: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204155.79429: getting variables 19665 1727204155.79431: in VariableManager get_vars() 19665 1727204155.79475: Calling all_inventory to load vars for managed-node3 19665 1727204155.79478: Calling groups_inventory to load vars for managed-node3 19665 1727204155.79480: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.79491: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.79494: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.79497: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.79919: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.80474: done with get_vars() 19665 1727204155.80484: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.079) 0:00:06.674 ***** 19665 1727204155.80743: entering _queue_task() for managed-node3/stat 19665 1727204155.81038: worker is 1 (out of 1 available) 19665 1727204155.81052: exiting _queue_task() for managed-node3/stat 19665 1727204155.81062: done queuing things up, now waiting for results queue to drain 19665 1727204155.81066: waiting for pending results... 
19665 1727204155.81420: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204155.81588: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000018f 19665 1727204155.81608: variable 'ansible_search_path' from source: unknown 19665 1727204155.81620: variable 'ansible_search_path' from source: unknown 19665 1727204155.81667: calling self._execute() 19665 1727204155.81758: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.81772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.81786: variable 'omit' from source: magic vars 19665 1727204155.82441: variable 'ansible_distribution_major_version' from source: facts 19665 1727204155.82459: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204155.82623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204155.82991: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204155.83041: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204155.83091: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204155.83129: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204155.83235: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204155.83302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204155.83417: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204155.83454: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204155.83766: variable '__network_is_ostree' from source: set_fact 19665 1727204155.83780: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204155.83788: when evaluation is False, skipping this task 19665 1727204155.83796: _execute() done 19665 1727204155.83803: dumping result to json 19665 1727204155.83811: done dumping result, returning 19665 1727204155.83823: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0dcc-3ea6-00000000018f] 19665 1727204155.83844: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000018f skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204155.84005: no more pending results, returning what we have 19665 1727204155.84009: results queue empty 19665 1727204155.84011: checking for any_errors_fatal 19665 1727204155.84018: done checking for any_errors_fatal 19665 1727204155.84019: checking for max_fail_percentage 19665 1727204155.84020: done checking for max_fail_percentage 19665 1727204155.84021: checking to see if all hosts have 
failed and the running result is not ok 19665 1727204155.84022: done checking to see if all hosts have failed 19665 1727204155.84023: getting the remaining hosts for this loop 19665 1727204155.84025: done getting the remaining hosts for this loop 19665 1727204155.84029: getting the next task for host managed-node3 19665 1727204155.84039: done getting next task for host managed-node3 19665 1727204155.84043: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204155.84046: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204155.84060: getting variables 19665 1727204155.84062: in VariableManager get_vars() 19665 1727204155.84106: Calling all_inventory to load vars for managed-node3 19665 1727204155.84109: Calling groups_inventory to load vars for managed-node3 19665 1727204155.84112: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.84124: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.84127: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.84129: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.84320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.84556: done with get_vars() 19665 1727204155.84572: done getting variables 19665 1727204155.84722: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000018f 19665 1727204155.84726: WORKER PROCESS EXITING 19665 1727204155.84771: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.040) 0:00:06.714 ***** 19665 1727204155.84926: entering _queue_task() for managed-node3/set_fact 19665 1727204155.85711: worker is 1 (out of 1 available) 19665 1727204155.85722: exiting _queue_task() for managed-node3/set_fact 19665 1727204155.85734: done queuing things up, now waiting for results queue to drain 19665 1727204155.85739: waiting for pending results... 
19665 1727204155.86022: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204155.86177: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000190 19665 1727204155.86199: variable 'ansible_search_path' from source: unknown 19665 1727204155.86206: variable 'ansible_search_path' from source: unknown 19665 1727204155.86252: calling self._execute() 19665 1727204155.86353: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.86370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.86387: variable 'omit' from source: magic vars 19665 1727204155.86852: variable 'ansible_distribution_major_version' from source: facts 19665 1727204155.86873: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204155.87059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204155.87357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204155.87450: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204155.87498: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204155.87535: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204155.87634: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204155.87671: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204155.87711: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204155.87745: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204155.87851: variable '__network_is_ostree' from source: set_fact 19665 1727204155.87866: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204155.87875: when evaluation is False, skipping this task 19665 1727204155.87881: _execute() done 19665 1727204155.87888: dumping result to json 19665 1727204155.87901: done dumping result, returning 19665 1727204155.87913: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0dcc-3ea6-000000000190] 19665 1727204155.87928: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000190 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204155.88083: no more pending results, returning what we have 19665 1727204155.88087: results queue empty 19665 1727204155.88088: checking for any_errors_fatal 19665 1727204155.88095: done checking for any_errors_fatal 19665 1727204155.88095: checking for max_fail_percentage 19665 1727204155.88097: done checking for max_fail_percentage 19665 1727204155.88098: checking to see 
if all hosts have failed and the running result is not ok 19665 1727204155.88099: done checking to see if all hosts have failed 19665 1727204155.88100: getting the remaining hosts for this loop 19665 1727204155.88102: done getting the remaining hosts for this loop 19665 1727204155.88106: getting the next task for host managed-node3 19665 1727204155.88115: done getting next task for host managed-node3 19665 1727204155.88119: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204155.88123: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204155.88139: getting variables 19665 1727204155.88142: in VariableManager get_vars() 19665 1727204155.88184: Calling all_inventory to load vars for managed-node3 19665 1727204155.88187: Calling groups_inventory to load vars for managed-node3 19665 1727204155.88190: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204155.88201: Calling all_plugins_play to load vars for managed-node3 19665 1727204155.88204: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204155.88207: Calling groups_plugins_play to load vars for managed-node3 19665 1727204155.88450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204155.88675: done with get_vars() 19665 1727204155.88687: done getting variables 19665 1727204155.88953: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000190 19665 1727204155.88957: WORKER PROCESS EXITING TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:55:55 -0400 (0:00:00.044) 0:00:06.759 ***** 19665 1727204155.89246: entering _queue_task() for managed-node3/service_facts 19665 1727204155.89248: Creating lock for service_facts 19665 1727204155.89720: worker is 1 (out of 1 available) 19665 1727204155.89742: exiting _queue_task() for managed-node3/service_facts 19665 1727204155.89755: done queuing things up, now waiting for results queue to drain 19665 1727204155.89756: waiting for pending results... 
19665 1727204155.90184: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204155.90420: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000192 19665 1727204155.90448: variable 'ansible_search_path' from source: unknown 19665 1727204155.90455: variable 'ansible_search_path' from source: unknown 19665 1727204155.90497: calling self._execute() 19665 1727204155.90594: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.90605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.90617: variable 'omit' from source: magic vars 19665 1727204155.91012: variable 'ansible_distribution_major_version' from source: facts 19665 1727204155.91030: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204155.91043: variable 'omit' from source: magic vars 19665 1727204155.91109: variable 'omit' from source: magic vars 19665 1727204155.91151: variable 'omit' from source: magic vars 19665 1727204155.91208: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204155.91251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204155.91279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204155.91302: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204155.91321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204155.91356: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204155.91366: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.91373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.91480: Set connection var ansible_connection to ssh 19665 1727204155.91493: Set connection var ansible_shell_type to sh 19665 1727204155.91503: Set connection var ansible_timeout to 10 19665 1727204155.91513: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204155.91534: Set connection var ansible_shell_executable to /bin/sh 19665 1727204155.91550: Set connection var ansible_pipelining to False 19665 1727204155.91579: variable 'ansible_shell_executable' from source: unknown 19665 1727204155.91588: variable 'ansible_connection' from source: unknown 19665 1727204155.91595: variable 'ansible_module_compression' from source: unknown 19665 1727204155.91602: variable 'ansible_shell_type' from source: unknown 19665 1727204155.91608: variable 'ansible_shell_executable' from source: unknown 19665 1727204155.91616: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204155.91624: variable 'ansible_pipelining' from source: unknown 19665 1727204155.91672: variable 'ansible_timeout' from source: unknown 19665 1727204155.91681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204155.92095: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204155.92110: variable 'omit' from source: magic vars 19665 
1727204155.92220: starting attempt loop 19665 1727204155.92228: running the handler 19665 1727204155.92248: _low_level_execute_command(): starting 19665 1727204155.92259: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204155.93073: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204155.93101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.93117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.93139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.93186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.93199: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204155.93220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.93242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204155.93255: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204155.93268: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204155.93280: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.93294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.93311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.93329: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.93343: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204155.93359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.93447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204155.93473: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204155.93492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204155.93573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204155.95190: stdout chunk (state=3): >>>/root <<< 19665 1727204155.95395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204155.95398: stdout chunk (state=3): >>><<< 19665 1727204155.95401: stderr chunk (state=3): >>><<< 19665 1727204155.95522: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204155.95526: _low_level_execute_command(): starting 19665 1727204155.95529: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339 `" && echo ansible-tmp-1727204155.9542124-20154-62695197669339="` echo /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339 `" ) && sleep 0' 19665 1727204155.96782: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.96786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.96807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.96956: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204155.96975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.96995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204155.97008: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204155.97019: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204155.97031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204155.97045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204155.97061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204155.97076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204155.97086: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204155.97098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204155.97177: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204155.97201: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204155.97217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204155.97295: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204155.99127: stdout chunk (state=3): >>>ansible-tmp-1727204155.9542124-20154-62695197669339=/root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339 <<< 19665 1727204155.99335: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204155.99339: stdout chunk (state=3): >>><<< 19665 1727204155.99342: stderr chunk (state=3): >>><<< 19665 1727204155.99671: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204155.9542124-20154-62695197669339=/root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204155.99676: variable 'ansible_module_compression' from source: unknown 19665 1727204155.99679: ANSIBALLZ: Using lock for service_facts 19665 1727204155.99681: ANSIBALLZ: Acquiring lock 19665 1727204155.99683: ANSIBALLZ: Lock acquired: 140619592746288 19665 1727204155.99685: ANSIBALLZ: Creating module 19665 1727204156.22472: ANSIBALLZ: Writing module into payload 19665 1727204156.22609: ANSIBALLZ: Writing module 19665 1727204156.22651: ANSIBALLZ: Renaming module 19665 1727204156.22670: ANSIBALLZ: Done creating module 19665 1727204156.22690: variable 'ansible_facts' from source: unknown 19665 1727204156.22780: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/AnsiballZ_service_facts.py 19665 1727204156.23091: Sending initial data 19665 1727204156.23094: Sent initial data (161 bytes) 19665 1727204156.23796: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204156.23799: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204156.23833: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204156.23837: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204156.23841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204156.23890: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204156.23901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204156.23963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204156.25771: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204156.25811: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204156.25849: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp8xx24mdy /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/AnsiballZ_service_facts.py <<< 19665 1727204156.25885: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204156.26687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204156.26803: stderr chunk (state=3): >>><<< 19665 1727204156.26806: stdout chunk (state=3): >>><<< 19665 1727204156.26824: done transferring module to remote 19665 1727204156.26833: _low_level_execute_command(): starting 19665 1727204156.26840: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/ /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/AnsiballZ_service_facts.py && sleep 0' 19665 1727204156.27317: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204156.27325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204156.27360: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204156.27372: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204156.27385: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204156.27396: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204156.27404: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204156.27410: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204156.27421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204156.27430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204156.27446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204156.27451: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204156.27499: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204156.27512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204156.27523: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204156.27579: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 19665 1727204156.29284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204156.29337: stderr chunk (state=3): >>><<< 19665 1727204156.29347: stdout chunk (state=3): >>><<< 19665 1727204156.29368: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204156.29371: _low_level_execute_command(): starting 19665 1727204156.29376: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/AnsiballZ_service_facts.py && sleep 0' 19665 1727204156.29836: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204156.29845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204156.29877: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204156.29910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204156.29962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204156.29972: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204156.30029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204157.60179: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 19665 1727204157.60213: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stop<<< 19665 1727204157.60231: stdout chunk (state=3): >>>ped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtim<<< 19665 
1727204157.60269: stdout chunk (state=3): >>>e-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name<<< 19665 1727204157.60276: stdout chunk (state=3): >>>": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": 
{"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19665 1727204157.61587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204157.62258: stderr chunk (state=3): >>><<< 19665 1727204157.62262: stdout chunk (state=3): >>><<< 19665 1727204157.62271: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": 
"hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204157.62588: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204157.62604: _low_level_execute_command(): starting 19665 1727204157.62614: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204155.9542124-20154-62695197669339/ > /dev/null 2>&1 && sleep 0' 19665 1727204157.63570: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.63574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204157.63612: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204157.63617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.63620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204157.63623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204157.63692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204157.63696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204157.63702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204157.63754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204157.65509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204157.65568: stderr chunk (state=3): >>><<< 19665 1727204157.65571: stdout chunk (state=3): >>><<< 19665 1727204157.65583: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204157.65589: handler run complete 19665 1727204157.65688: variable 'ansible_facts' from source: unknown 19665 1727204157.65767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204157.66002: variable 'ansible_facts' from source: unknown 19665 1727204157.66088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204157.66284: attempt loop complete, returning result 19665 1727204157.66294: _execute() done 19665 1727204157.66301: dumping result to json 19665 1727204157.66363: done dumping result, returning 19665 1727204157.66380: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0dcc-3ea6-000000000192] 19665 1727204157.66389: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000192 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204157.67140: no more pending results, returning what we have 19665 1727204157.67143: results queue empty 19665 1727204157.67144: checking for any_errors_fatal 19665 1727204157.67149: done checking for any_errors_fatal 19665 1727204157.67150: checking for max_fail_percentage 19665 1727204157.67151: done checking for max_fail_percentage 19665 1727204157.67152: checking to see if all hosts have failed and the running result is not ok 19665 1727204157.67153: done checking to see if all hosts have failed 19665 1727204157.67154: getting the remaining hosts for this loop 19665 1727204157.67155: done getting the remaining hosts for this loop 19665 1727204157.67159: getting the next task for host managed-node3 19665 1727204157.67166: done getting next task for host managed-node3 19665 1727204157.67170: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204157.67173: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204157.67187: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000192 19665 1727204157.67190: WORKER PROCESS EXITING 19665 1727204157.67196: getting variables 19665 1727204157.67198: in VariableManager get_vars() 19665 1727204157.67234: Calling all_inventory to load vars for managed-node3 19665 1727204157.67239: Calling groups_inventory to load vars for managed-node3 19665 1727204157.67243: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204157.67254: Calling all_plugins_play to load vars for managed-node3 19665 1727204157.67257: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204157.67261: Calling groups_plugins_play to load vars for managed-node3 19665 1727204157.67641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204157.67956: done with get_vars() 19665 1727204157.67968: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:55:57 -0400 (0:00:01.787) 0:00:08.547 ***** 19665 1727204157.68038: entering _queue_task() for managed-node3/package_facts 19665 1727204157.68039: Creating lock for package_facts 19665 1727204157.68252: worker is 1 (out of 1 available) 19665 1727204157.68269: exiting _queue_task() for managed-node3/package_facts 19665 1727204157.68280: done queuing things up, now waiting for results queue to drain 19665 1727204157.68282: waiting for pending results... 19665 1727204157.68447: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204157.68548: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000193 19665 1727204157.68562: variable 'ansible_search_path' from source: unknown 19665 1727204157.68577: variable 'ansible_search_path' from source: unknown 19665 1727204157.68620: calling self._execute() 19665 1727204157.68688: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204157.68695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204157.68704: variable 'omit' from source: magic vars 19665 1727204157.69004: variable 'ansible_distribution_major_version' from source: facts 19665 1727204157.69017: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204157.69025: variable 'omit' from source: magic vars 19665 1727204157.69071: variable 'omit' from source: magic vars 19665 1727204157.69097: variable 'omit' from source: magic vars 19665 1727204157.69129: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204157.69166: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204157.69186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204157.69200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204157.69208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204157.69233: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204157.69238: variable 'ansible_host' from source: host vars for 
'managed-node3' 19665 1727204157.69241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204157.69307: Set connection var ansible_connection to ssh 19665 1727204157.69313: Set connection var ansible_shell_type to sh 19665 1727204157.69319: Set connection var ansible_timeout to 10 19665 1727204157.69324: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204157.69332: Set connection var ansible_shell_executable to /bin/sh 19665 1727204157.69341: Set connection var ansible_pipelining to False 19665 1727204157.69357: variable 'ansible_shell_executable' from source: unknown 19665 1727204157.69360: variable 'ansible_connection' from source: unknown 19665 1727204157.69363: variable 'ansible_module_compression' from source: unknown 19665 1727204157.69365: variable 'ansible_shell_type' from source: unknown 19665 1727204157.69369: variable 'ansible_shell_executable' from source: unknown 19665 1727204157.69371: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204157.69373: variable 'ansible_pipelining' from source: unknown 19665 1727204157.69375: variable 'ansible_timeout' from source: unknown 19665 1727204157.69379: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204157.69526: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204157.69535: variable 'omit' from source: magic vars 19665 1727204157.69542: starting attempt loop 19665 1727204157.69545: running the handler 19665 1727204157.69556: _low_level_execute_command(): starting 19665 1727204157.69563: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204157.71301: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204157.71361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.71466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204157.71496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204157.71604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204157.71618: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204157.71631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204157.71651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204157.71663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204157.71679: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204157.71692: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.71711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204157.71729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204157.71747: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204157.71760: stderr chunk (state=3): >>>debug2: match found <<< 19665 
1727204157.71791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204157.72050: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204157.72180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204157.72200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204157.72289: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204157.73925: stdout chunk (state=3): >>>/root <<< 19665 1727204157.74054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204157.74974: stderr chunk (state=3): >>><<< 19665 1727204157.75003: stdout chunk (state=3): >>><<< 19665 1727204157.75151: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204157.75155: _low_level_execute_command(): starting 19665 1727204157.75158: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499 `" && echo ansible-tmp-1727204157.7503133-20403-118088389022499="` echo /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499 `" ) && sleep 0' 19665 1727204157.75790: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204157.75809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.75825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204157.75847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204157.75892: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204157.75916: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204157.75932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204157.75954: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204157.75992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204157.76005: stderr chunk (state=3): >>>debug1: re-parsing configuration 
<<< 19665 1727204157.76022: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204157.76040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204157.76058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204157.76074: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204157.76086: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204157.76101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204157.76185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204157.76208: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204157.76225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204157.76308: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204157.78150: stdout chunk (state=3): >>>ansible-tmp-1727204157.7503133-20403-118088389022499=/root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499 <<< 19665 1727204157.78359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204157.78362: stdout chunk (state=3): >>><<< 19665 1727204157.78367: stderr chunk (state=3): >>><<< 19665 1727204157.78841: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204157.7503133-20403-118088389022499=/root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204157.78844: variable 'ansible_module_compression' from source: unknown 19665 1727204157.78847: ANSIBALLZ: Using lock for package_facts 19665 1727204157.78849: ANSIBALLZ: Acquiring lock 19665 1727204157.78851: ANSIBALLZ: Lock acquired: 140619594859248 19665 1727204157.78853: ANSIBALLZ: Creating module 19665 1727204158.18445: ANSIBALLZ: Writing module into payload 19665 1727204158.18696: ANSIBALLZ: Writing module 19665 1727204158.18745: ANSIBALLZ: Renaming module 19665 1727204158.18761: ANSIBALLZ: Done creating module 19665 1727204158.18793: variable 'ansible_facts' from source: unknown 19665 1727204158.18940: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/AnsiballZ_package_facts.py 19665 1727204158.19671: Sending initial data 19665 1727204158.19674: Sent initial data (162 bytes) 19665 1727204158.20842: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204158.20859: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.20879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.20898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.20945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.20960: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204158.20981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.21000: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204158.21012: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204158.21023: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204158.21039: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.21055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.21077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.21090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.21102: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204158.21117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.21200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204158.21224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204158.21246: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204158.21326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204158.23143: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204158.23189: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204158.23224: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpxxeij8oj /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/AnsiballZ_package_facts.py <<< 19665 1727204158.23240: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 
1727204158.26051: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204158.26173: stderr chunk (state=3): >>><<< 19665 1727204158.26176: stdout chunk (state=3): >>><<< 19665 1727204158.26178: done transferring module to remote 19665 1727204158.26181: _low_level_execute_command(): starting 19665 1727204158.26183: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/ /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/AnsiballZ_package_facts.py && sleep 0' 19665 1727204158.27654: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204158.27676: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.27689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.27708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.27752: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.27812: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204158.27825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.27843: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204158.27853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204158.27861: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204158.27872: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.27883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.27894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.27918: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.27927: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204158.27940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.28016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204158.28153: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204158.28174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204158.28255: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204158.30122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204158.30126: stdout chunk (state=3): >>><<< 19665 1727204158.30129: stderr chunk (state=3): >>><<< 19665 1727204158.30235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204158.30243: _low_level_execute_command(): starting 19665 1727204158.30246: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/AnsiballZ_package_facts.py && sleep 0' 19665 1727204158.30873: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204158.30891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.30908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.30926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.31125: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.31977: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204158.31994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.32012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204158.32026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204158.32039: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204158.32052: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204158.32069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.32089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.32100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204158.32111: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204158.32123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.32226: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204158.32261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204158.32292: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204158.32521: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204158.78998: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": 
"1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version<<< 19665 1727204158.79139: stdout chunk (state=3): >>>": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": 
"0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": 
[{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": 
[{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": 
"17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": 
"1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": 
"2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "p<<< 19665 1727204158.79155: stdout chunk (state=3): >>>erl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": 
"2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19665 1727204158.80744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204158.80748: stdout chunk (state=3): >>><<< 19665 1727204158.80750: stderr chunk (state=3): >>><<< 19665 1727204158.80976: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": 
[{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": 
"10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", 
"release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": 
[{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 
1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": 
"userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": 
"2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", 
"version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", 
"version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204158.87608: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204158.87776: _low_level_execute_command(): starting 19665 1727204158.87788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204157.7503133-20403-118088389022499/ > /dev/null 2>&1 && sleep 0' 19665 1727204158.89854: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.89858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204158.90006: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.90012: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204158.90016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204158.90300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204158.90304: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204158.90407: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204158.90621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204158.92350: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204158.92440: stderr chunk (state=3): >>><<< 19665 1727204158.92444: stdout chunk (state=3): >>><<< 19665 1727204158.92574: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204158.92578: handler run complete 19665 1727204158.94026: variable 'ansible_facts' from source: unknown 19665 1727204158.95250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.00551: variable 'ansible_facts' from source: unknown 19665 1727204159.01571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.03422: attempt loop complete, returning result 19665 1727204159.03468: _execute() done 19665 1727204159.03506: dumping result to json 19665 1727204159.04152: done dumping result, returning 19665 1727204159.04175: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0dcc-3ea6-000000000193] 19665 1727204159.04186: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000193 19665 1727204159.08383: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000193 19665 1727204159.08387: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204159.08494: no more pending results, returning what we have 19665 1727204159.08497: results queue empty 19665 1727204159.08499: checking for any_errors_fatal 19665 1727204159.08502: done checking for any_errors_fatal 19665 1727204159.08503: checking for max_fail_percentage 19665 1727204159.08505: done checking for max_fail_percentage 19665 1727204159.08505: checking to see if all hosts 
have failed and the running result is not ok 19665 1727204159.08507: done checking to see if all hosts have failed 19665 1727204159.08507: getting the remaining hosts for this loop 19665 1727204159.08509: done getting the remaining hosts for this loop 19665 1727204159.08513: getting the next task for host managed-node3 19665 1727204159.08520: done getting next task for host managed-node3 19665 1727204159.08524: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204159.08526: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204159.08538: getting variables 19665 1727204159.08540: in VariableManager get_vars() 19665 1727204159.08574: Calling all_inventory to load vars for managed-node3 19665 1727204159.08577: Calling groups_inventory to load vars for managed-node3 19665 1727204159.08580: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.08590: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.08592: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.08595: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.10855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.14341: done with get_vars() 19665 1727204159.14376: done getting variables 19665 1727204159.14443: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:55:59 -0400 (0:00:01.466) 0:00:10.013 ***** 19665 1727204159.14680: entering _queue_task() for managed-node3/debug 19665 1727204159.14996: worker is 1 (out of 1 available) 19665 1727204159.15010: exiting _queue_task() for managed-node3/debug 19665 1727204159.15023: done queuing things up, now waiting for results queue to drain 19665 1727204159.15024: waiting for pending results... 
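For orientation, the task just queued above ("Print network provider", tasks/main.yml:7) is a plain debug task: the trace loads the 'debug' action module, reads the network_provider variable from set_fact, and in the records that follow prints "Using network provider: nm". A minimal, hedged sketch of what that task plausibly looks like in the role source is below; the msg template is an assumption, since the log only shows the rendered output. Note also that every task in this block carries the inherited guard ansible_distribution_major_version != '6' seen in the trace, whether that guard lives on the task, the block, or the import is not visible here.

    # Hedged reconstruction from the trace, not the actual role file.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"  # assumed template; the log shows only the rendered text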
19665 1727204159.15998: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204159.16213: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000015 19665 1727204159.16233: variable 'ansible_search_path' from source: unknown 19665 1727204159.16244: variable 'ansible_search_path' from source: unknown 19665 1727204159.16286: calling self._execute() 19665 1727204159.16570: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.16582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.16595: variable 'omit' from source: magic vars 19665 1727204159.17366: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.17409: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.17511: variable 'omit' from source: magic vars 19665 1727204159.17553: variable 'omit' from source: magic vars 19665 1727204159.17743: variable 'network_provider' from source: set_fact 19665 1727204159.17849: variable 'omit' from source: magic vars 19665 1727204159.17897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204159.18055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204159.18082: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204159.18104: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204159.18118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204159.18186: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204159.18271: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.18281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.18496: Set connection var ansible_connection to ssh 19665 1727204159.18503: Set connection var ansible_shell_type to sh 19665 1727204159.18513: Set connection var ansible_timeout to 10 19665 1727204159.18522: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204159.18534: Set connection var ansible_shell_executable to /bin/sh 19665 1727204159.18548: Set connection var ansible_pipelining to False 19665 1727204159.18578: variable 'ansible_shell_executable' from source: unknown 19665 1727204159.18676: variable 'ansible_connection' from source: unknown 19665 1727204159.18684: variable 'ansible_module_compression' from source: unknown 19665 1727204159.18690: variable 'ansible_shell_type' from source: unknown 19665 1727204159.18701: variable 'ansible_shell_executable' from source: unknown 19665 1727204159.18712: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.18719: variable 'ansible_pipelining' from source: unknown 19665 1727204159.18726: variable 'ansible_timeout' from source: unknown 19665 1727204159.18733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.19079: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 19665 1727204159.19095: variable 'omit' from source: magic vars 19665 1727204159.19106: starting attempt loop 19665 1727204159.19112: running the handler 19665 1727204159.19273: handler run complete 19665 1727204159.19292: attempt loop complete, returning result 19665 1727204159.19298: _execute() done 19665 1727204159.19305: dumping result to json 19665 1727204159.19311: done dumping result, returning 19665 1727204159.19322: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0dcc-3ea6-000000000015] 19665 1727204159.19330: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000015 19665 1727204159.19448: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000015 ok: [managed-node3] => {} MSG: Using network provider: nm 19665 1727204159.19521: no more pending results, returning what we have 19665 1727204159.19525: results queue empty 19665 1727204159.19526: checking for any_errors_fatal 19665 1727204159.19536: done checking for any_errors_fatal 19665 1727204159.19537: checking for max_fail_percentage 19665 1727204159.19538: done checking for max_fail_percentage 19665 1727204159.19539: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.19540: done checking to see if all hosts have failed 19665 1727204159.19541: getting the remaining hosts for this loop 19665 1727204159.19543: done getting the remaining hosts for this loop 19665 1727204159.19547: getting the next task for host managed-node3 19665 1727204159.19553: done getting next task for host managed-node3 19665 1727204159.19557: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204159.19559: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204159.19570: getting variables 19665 1727204159.19572: in VariableManager get_vars() 19665 1727204159.19609: Calling all_inventory to load vars for managed-node3 19665 1727204159.19612: Calling groups_inventory to load vars for managed-node3 19665 1727204159.19614: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.19626: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.19629: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.19632: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.20988: WORKER PROCESS EXITING 19665 1727204159.21466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.25808: done with get_vars() 19665 1727204159.25842: done getting variables 19665 1727204159.25942: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.114) 0:00:10.128 ***** 19665 1727204159.26156: entering _queue_task() for managed-node3/fail 19665 1727204159.26158: Creating lock for fail 19665 1727204159.26547: worker is 1 (out of 1 available) 19665 1727204159.26568: exiting _queue_task() for managed-node3/fail 19665 1727204159.26580: done queuing things up, now waiting for results queue to drain 19665 1727204159.26582: waiting for pending results... 
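The fail task queued above ("Abort applying the network state configuration if using the `network_state` variable with the initscripts provider", tasks/main.yml:11) is skipped in the records that follow because its first conditional, network_state != {}, evaluates to False (network_state comes from the role defaults and is empty here). A hedged sketch of the shape of such a task, reconstructed only from the task name, the fail action module, and the reported conditional; the message text and any additional provider check are assumptions:

    # Hedged reconstruction; only the first `when` entry is confirmed by the trace.
    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider  # assumed wording
      when:
        - network_state != {}                 # confirmed: reported as the false_condition
        - network_provider == "initscripts"   # assumed from the task name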
19665 1727204159.26851: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204159.26979: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000016 19665 1727204159.27006: variable 'ansible_search_path' from source: unknown 19665 1727204159.27017: variable 'ansible_search_path' from source: unknown 19665 1727204159.27060: calling self._execute() 19665 1727204159.27197: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.27209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.27298: variable 'omit' from source: magic vars 19665 1727204159.28187: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.28212: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.29251: variable 'network_state' from source: role '' defaults 19665 1727204159.29271: Evaluated conditional (network_state != {}): False 19665 1727204159.29280: when evaluation is False, skipping this task 19665 1727204159.29287: _execute() done 19665 1727204159.29293: dumping result to json 19665 1727204159.29301: done dumping result, returning 19665 1727204159.29312: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0dcc-3ea6-000000000016] 19665 1727204159.29322: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000016 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204159.29486: no more pending results, returning what we have 19665 1727204159.29491: results queue empty 19665 1727204159.29492: checking for any_errors_fatal 19665 1727204159.29500: done checking for any_errors_fatal 19665 1727204159.29501: checking for max_fail_percentage 19665 1727204159.29503: done checking for max_fail_percentage 19665 1727204159.29504: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.29505: done checking to see if all hosts have failed 19665 1727204159.29505: getting the remaining hosts for this loop 19665 1727204159.29507: done getting the remaining hosts for this loop 19665 1727204159.29511: getting the next task for host managed-node3 19665 1727204159.29518: done getting next task for host managed-node3 19665 1727204159.29523: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204159.29525: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204159.29540: getting variables 19665 1727204159.29542: in VariableManager get_vars() 19665 1727204159.29583: Calling all_inventory to load vars for managed-node3 19665 1727204159.29587: Calling groups_inventory to load vars for managed-node3 19665 1727204159.29589: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.29602: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.29605: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.29608: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.30973: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000016 19665 1727204159.30977: WORKER PROCESS EXITING 19665 1727204159.32608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.36162: done with get_vars() 19665 1727204159.36319: done getting variables 19665 1727204159.36385: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.103) 0:00:10.231 ***** 19665 1727204159.36534: entering _queue_task() for managed-node3/fail 19665 1727204159.37126: worker is 1 (out of 1 available) 19665 1727204159.37140: exiting _queue_task() for managed-node3/fail 19665 1727204159.37152: done queuing things up, now waiting for results queue to drain 19665 1727204159.37184: waiting for pending results... 
19665 1727204159.37703: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204159.38154: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000017 19665 1727204159.38201: variable 'ansible_search_path' from source: unknown 19665 1727204159.38210: variable 'ansible_search_path' from source: unknown 19665 1727204159.38284: calling self._execute() 19665 1727204159.38471: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.38571: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.38604: variable 'omit' from source: magic vars 19665 1727204159.39509: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.39530: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.39840: variable 'network_state' from source: role '' defaults 19665 1727204159.39857: Evaluated conditional (network_state != {}): False 19665 1727204159.39905: when evaluation is False, skipping this task 19665 1727204159.39914: _execute() done 19665 1727204159.39923: dumping result to json 19665 1727204159.39961: done dumping result, returning 19665 1727204159.39991: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0dcc-3ea6-000000000017] 19665 1727204159.40120: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000017 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204159.41017: no more pending results, returning what we have 19665 1727204159.41021: results queue empty 19665 1727204159.41022: checking for any_errors_fatal 19665 1727204159.41032: done checking for any_errors_fatal 19665 1727204159.41033: checking for max_fail_percentage 19665 1727204159.41035: done checking for max_fail_percentage 19665 1727204159.41036: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.41039: done checking to see if all hosts have failed 19665 1727204159.41040: getting the remaining hosts for this loop 19665 1727204159.41042: done getting the remaining hosts for this loop 19665 1727204159.41047: getting the next task for host managed-node3 19665 1727204159.41055: done getting next task for host managed-node3 19665 1727204159.41060: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204159.41066: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204159.41082: getting variables 19665 1727204159.41084: in VariableManager get_vars() 19665 1727204159.41127: Calling all_inventory to load vars for managed-node3 19665 1727204159.41130: Calling groups_inventory to load vars for managed-node3 19665 1727204159.41133: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.41149: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.41152: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.41156: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.43516: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000017 19665 1727204159.43521: WORKER PROCESS EXITING 19665 1727204159.45632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.47691: done with get_vars() 19665 1727204159.47721: done getting variables 19665 1727204159.47794: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.112) 0:00:10.344 ***** 19665 1727204159.47826: entering _queue_task() for managed-node3/fail 19665 1727204159.48607: worker is 1 (out of 1 available) 19665 1727204159.48683: exiting _queue_task() for managed-node3/fail 19665 1727204159.48767: done queuing things up, now waiting for results queue to drain 19665 1727204159.48770: waiting for pending results... 
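The fail task queued above ("Abort applying teaming configuration if the system version of the managed host is EL10 or later", tasks/main.yml:25) is skipped below because ansible_distribution_major_version | int > 9 is False on this EL9 host. A hedged sketch follows, with the team-interface check and message wording assumed from the task name rather than taken from the role source:

    # Hedged reconstruction; only the version conditional is confirmed by the trace.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # assumed wording
      when:
        - ansible_distribution_major_version | int > 9   # confirmed: reported as the false_condition
        # presumably also a check that network_connections defines a team interface (assumption)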
19665 1727204159.49831: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204159.49952: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000018 19665 1727204159.49974: variable 'ansible_search_path' from source: unknown 19665 1727204159.49978: variable 'ansible_search_path' from source: unknown 19665 1727204159.50012: calling self._execute() 19665 1727204159.50111: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.50115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.50126: variable 'omit' from source: magic vars 19665 1727204159.50504: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.50522: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.50796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204159.58041: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204159.58125: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204159.58181: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204159.58255: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204159.58385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204159.58482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.58883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.58962: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.59178: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.59193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.59503: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.59520: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19665 1727204159.59523: when evaluation is False, skipping this task 19665 1727204159.59526: _execute() done 19665 1727204159.59528: dumping result to json 19665 1727204159.59531: done dumping result, returning 19665 1727204159.59695: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0dcc-3ea6-000000000018] 19665 1727204159.59704: sending task result for task 
0affcd87-79f5-0dcc-3ea6-000000000018 19665 1727204159.59866: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000018 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19665 1727204159.59992: no more pending results, returning what we have 19665 1727204159.59997: results queue empty 19665 1727204159.59998: checking for any_errors_fatal 19665 1727204159.60005: done checking for any_errors_fatal 19665 1727204159.60006: checking for max_fail_percentage 19665 1727204159.60008: done checking for max_fail_percentage 19665 1727204159.60009: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.60010: done checking to see if all hosts have failed 19665 1727204159.60010: getting the remaining hosts for this loop 19665 1727204159.60012: done getting the remaining hosts for this loop 19665 1727204159.60017: getting the next task for host managed-node3 19665 1727204159.60056: done getting next task for host managed-node3 19665 1727204159.60061: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204159.60063: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204159.60081: getting variables 19665 1727204159.60084: in VariableManager get_vars() 19665 1727204159.60123: Calling all_inventory to load vars for managed-node3 19665 1727204159.60126: Calling groups_inventory to load vars for managed-node3 19665 1727204159.60128: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.60142: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.60145: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.60181: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.61280: WORKER PROCESS EXITING 19665 1727204159.63151: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.67925: done with get_vars() 19665 1727204159.67966: done getting variables 19665 1727204159.68079: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.202) 0:00:10.547 ***** 19665 1727204159.68113: entering _queue_task() for managed-node3/dnf 19665 1727204159.68479: worker is 1 (out of 1 available) 19665 1727204159.68493: exiting _queue_task() for managed-node3/dnf 19665 1727204159.68505: done queuing things up, now waiting for results queue to drain 19665 1727204159.68507: waiting for pending results... 
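The task queued above ("Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces", tasks/main.yml:36) uses the dnf action module. In the records that follow it first passes the guard ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7, then is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this connection set. A hedged sketch of its likely shape; only the two conditionals and the dnf action are visible in the trace, so the module arguments are assumptions:

    # Hedged reconstruction; package list, state, and check_mode are assumptions.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed; the actual package list is not shown here
        state: latest
      check_mode: true                   # assumed from "check if updates ... are available"
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # confirmed: evaluated True
        - __network_wireless_connections_defined or __network_team_connections_defined       # confirmed: evaluated False, so skipped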
19665 1727204159.69217: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204159.69367: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000019 19665 1727204159.69379: variable 'ansible_search_path' from source: unknown 19665 1727204159.69382: variable 'ansible_search_path' from source: unknown 19665 1727204159.69454: calling self._execute() 19665 1727204159.69587: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.69590: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.69600: variable 'omit' from source: magic vars 19665 1727204159.70112: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.70124: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.70381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204159.74318: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204159.74416: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204159.74471: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204159.74542: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204159.74583: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204159.74728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.74761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.74800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.74861: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.74890: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.75042: variable 'ansible_distribution' from source: facts 19665 1727204159.75046: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.75060: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19665 1727204159.75198: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204159.75346: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.75389: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.75418: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.75506: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.75513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.75605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.75652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.75705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.75741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.75757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.75810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.75855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.75904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.75951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.76007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.76239: variable 'network_connections' from source: play vars 19665 1727204159.76258: variable 'interface' from source: set_fact 19665 1727204159.76372: variable 'interface' from source: set_fact 19665 1727204159.76381: variable 'interface' from source: set_fact 19665 1727204159.76453: variable 'interface' from source: set_fact 19665 1727204159.76531: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 19665 1727204159.76786: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204159.76858: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204159.76882: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204159.76928: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204159.77004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204159.77022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204159.77051: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.77091: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204159.77176: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204159.77566: variable 'network_connections' from source: play vars 19665 1727204159.77575: variable 'interface' from source: set_fact 19665 1727204159.77634: variable 'interface' from source: set_fact 19665 1727204159.77643: variable 'interface' from source: set_fact 19665 1727204159.77709: variable 'interface' from source: set_fact 19665 1727204159.77743: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204159.77746: when evaluation is False, skipping this task 19665 1727204159.77749: _execute() done 19665 1727204159.77751: dumping result to json 19665 1727204159.77753: done dumping result, returning 19665 1727204159.77769: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000019] 19665 1727204159.77778: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000019 19665 1727204159.77870: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000019 19665 1727204159.77875: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204159.77926: no more pending results, returning what we have 19665 1727204159.77931: results queue empty 19665 1727204159.77932: checking for any_errors_fatal 19665 1727204159.77940: done checking for any_errors_fatal 19665 1727204159.77941: checking for max_fail_percentage 19665 1727204159.77943: done checking for max_fail_percentage 19665 1727204159.77944: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.77945: done checking to see if all hosts have failed 19665 1727204159.77946: getting the remaining hosts for this loop 19665 1727204159.77948: done getting the remaining hosts for this loop 19665 
1727204159.77953: getting the next task for host managed-node3 19665 1727204159.77961: done getting next task for host managed-node3 19665 1727204159.77967: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204159.77969: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204159.77985: getting variables 19665 1727204159.77987: in VariableManager get_vars() 19665 1727204159.78027: Calling all_inventory to load vars for managed-node3 19665 1727204159.78029: Calling groups_inventory to load vars for managed-node3 19665 1727204159.78032: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.78045: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.78048: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.78051: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.80512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.83217: done with get_vars() 19665 1727204159.83258: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204159.83345: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.152) 0:00:10.700 ***** 19665 1727204159.83388: entering _queue_task() for managed-node3/yum 19665 1727204159.83390: Creating lock for yum 19665 1727204159.83723: worker is 1 (out of 1 available) 19665 1727204159.83736: exiting _queue_task() for managed-node3/yum 19665 1727204159.83751: done queuing things up, now waiting for results queue to drain 19665 1727204159.83752: waiting for pending results... 
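The companion task queued above ("Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces", tasks/main.yml:48) covers older hosts; note the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line, since on this controller the yum action is an alias for dnf. The records that follow skip it because ansible_distribution_major_version | int < 8 is False on EL9. A hedged sketch along the same lines as the DNF variant, with the module arguments again assumed:

    # Hedged reconstruction; only the version conditional and the yum action are confirmed.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed
        state: latest
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8                                   # confirmed: reported as the false_condition
        - __network_wireless_connections_defined or __network_team_connections_defined   # assumed from the task name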
19665 1727204159.84034: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204159.84150: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001a 19665 1727204159.84162: variable 'ansible_search_path' from source: unknown 19665 1727204159.84165: variable 'ansible_search_path' from source: unknown 19665 1727204159.84205: calling self._execute() 19665 1727204159.84320: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.84326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.84343: variable 'omit' from source: magic vars 19665 1727204159.84894: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.84906: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.85107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204159.88732: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204159.88809: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204159.88849: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204159.88891: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204159.89030: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204159.89226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204159.89261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204159.89289: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204159.89448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204159.89464: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204159.89678: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.89693: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19665 1727204159.89697: when evaluation is False, skipping this task 19665 1727204159.89700: _execute() done 19665 1727204159.89703: dumping result to json 19665 1727204159.89706: done dumping result, returning 19665 1727204159.89716: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-00000000001a] 19665 
1727204159.89721: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001a 19665 1727204159.89832: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001a 19665 1727204159.89836: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19665 1727204159.89892: no more pending results, returning what we have 19665 1727204159.89896: results queue empty 19665 1727204159.89897: checking for any_errors_fatal 19665 1727204159.89903: done checking for any_errors_fatal 19665 1727204159.89904: checking for max_fail_percentage 19665 1727204159.89906: done checking for max_fail_percentage 19665 1727204159.89907: checking to see if all hosts have failed and the running result is not ok 19665 1727204159.89908: done checking to see if all hosts have failed 19665 1727204159.89908: getting the remaining hosts for this loop 19665 1727204159.89911: done getting the remaining hosts for this loop 19665 1727204159.89915: getting the next task for host managed-node3 19665 1727204159.89921: done getting next task for host managed-node3 19665 1727204159.89925: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204159.89927: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204159.89943: getting variables 19665 1727204159.89944: in VariableManager get_vars() 19665 1727204159.89983: Calling all_inventory to load vars for managed-node3 19665 1727204159.89986: Calling groups_inventory to load vars for managed-node3 19665 1727204159.89988: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204159.89999: Calling all_plugins_play to load vars for managed-node3 19665 1727204159.90002: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204159.90005: Calling groups_plugins_play to load vars for managed-node3 19665 1727204159.93081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204159.96486: done with get_vars() 19665 1727204159.96510: done getting variables 19665 1727204159.96575: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:55:59 -0400 (0:00:00.132) 0:00:10.832 ***** 19665 1727204159.96607: entering _queue_task() for managed-node3/fail 19665 1727204159.97646: worker is 1 (out of 1 available) 19665 1727204159.97661: exiting _queue_task() for managed-node3/fail 19665 1727204159.97677: done queuing things up, now waiting for results queue to drain 19665 1727204159.97679: waiting for pending results... 
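The fail task queued above ("Ask user's consent to restart NetworkManager due to wireless or team interfaces", tasks/main.yml:60) is skipped in the following records for the same reason as the package-update checks: the connection list built from the interface set_fact defines neither wireless nor team connections, so __network_wireless_connections_defined or __network_team_connections_defined is False. A hedged sketch of its shape; the message wording is an assumption:

    # Hedged reconstruction; only the conditional and the fail action are confirmed by the trace.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: Restarting NetworkManager is required for wireless or team interfaces; confirm before re-running  # assumed wording
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # confirmed: reported as the false_condition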
19665 1727204159.98623: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204159.98756: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001b 19665 1727204159.98859: variable 'ansible_search_path' from source: unknown 19665 1727204159.98866: variable 'ansible_search_path' from source: unknown 19665 1727204159.98936: calling self._execute() 19665 1727204159.99045: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204159.99050: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204159.99060: variable 'omit' from source: magic vars 19665 1727204159.99570: variable 'ansible_distribution_major_version' from source: facts 19665 1727204159.99583: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204159.99709: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204159.99916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204160.02804: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204160.02879: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204160.02914: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204160.02952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204160.02980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204160.03063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.03101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.03126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.03175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.03191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.03235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.03260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.03290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.03328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.03343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.03388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.03410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.03434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.03477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.03495: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.03675: variable 'network_connections' from source: play vars 19665 1727204160.03687: variable 'interface' from source: set_fact 19665 1727204160.03771: variable 'interface' from source: set_fact 19665 1727204160.03780: variable 'interface' from source: set_fact 19665 1727204160.03848: variable 'interface' from source: set_fact 19665 1727204160.03918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204160.04111: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204160.04152: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204160.04185: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204160.04214: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204160.04259: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204160.04283: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204160.04307: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.04331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204160.04398: 
variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204160.04653: variable 'network_connections' from source: play vars 19665 1727204160.04656: variable 'interface' from source: set_fact 19665 1727204160.04725: variable 'interface' from source: set_fact 19665 1727204160.04731: variable 'interface' from source: set_fact 19665 1727204160.04793: variable 'interface' from source: set_fact 19665 1727204160.04830: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204160.04833: when evaluation is False, skipping this task 19665 1727204160.04835: _execute() done 19665 1727204160.04838: dumping result to json 19665 1727204160.04844: done dumping result, returning 19665 1727204160.04852: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-00000000001b] 19665 1727204160.04863: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001b 19665 1727204160.04950: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001b 19665 1727204160.04954: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204160.05005: no more pending results, returning what we have 19665 1727204160.05009: results queue empty 19665 1727204160.05011: checking for any_errors_fatal 19665 1727204160.05019: done checking for any_errors_fatal 19665 1727204160.05020: checking for max_fail_percentage 19665 1727204160.05022: done checking for max_fail_percentage 19665 1727204160.05023: checking to see if all hosts have failed and the running result is not ok 19665 1727204160.05024: done checking to see if all hosts have failed 19665 1727204160.05024: getting the remaining hosts for this loop 19665 1727204160.05026: done getting the remaining hosts for this loop 19665 1727204160.05030: getting the next task for host managed-node3 19665 1727204160.05040: done getting next task for host managed-node3 19665 1727204160.05045: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19665 1727204160.05047: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204160.05060: getting variables 19665 1727204160.05062: in VariableManager get_vars() 19665 1727204160.05103: Calling all_inventory to load vars for managed-node3 19665 1727204160.05105: Calling groups_inventory to load vars for managed-node3 19665 1727204160.05108: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204160.05119: Calling all_plugins_play to load vars for managed-node3 19665 1727204160.05122: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204160.05125: Calling groups_plugins_play to load vars for managed-node3 19665 1727204160.09587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204160.12842: done with get_vars() 19665 1727204160.12877: done getting variables 19665 1727204160.12938: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.163) 0:00:10.996 ***** 19665 1727204160.12971: entering _queue_task() for managed-node3/package 19665 1727204160.13685: worker is 1 (out of 1 available) 19665 1727204160.13697: exiting _queue_task() for managed-node3/package 19665 1727204160.13709: done queuing things up, now waiting for results queue to drain 19665 1727204160.13711: waiting for pending results... 19665 1727204160.15577: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 19665 1727204160.15973: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001c 19665 1727204160.15993: variable 'ansible_search_path' from source: unknown 19665 1727204160.16001: variable 'ansible_search_path' from source: unknown 19665 1727204160.16049: calling self._execute() 19665 1727204160.16265: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.16278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.16404: variable 'omit' from source: magic vars 19665 1727204160.18055: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.18102: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204160.18630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204160.19145: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204160.19325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204160.19367: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204160.19515: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204160.19754: variable 'network_packages' from source: role '' defaults 19665 1727204160.19891: variable '__network_provider_setup' from source: role '' defaults 19665 1727204160.20063: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204160.20143: variable 
'__network_service_name_default_nm' from source: role '' defaults 19665 1727204160.20175: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204160.20328: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204160.20728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204160.23950: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204160.24042: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204160.24095: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204160.24131: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204160.24168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204160.24265: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.24313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.24347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.24496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.24516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.24569: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.24606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.24639: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.24687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.24713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.25141: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204160.25381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.25490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.25520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.25688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.25709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.25910: variable 'ansible_python' from source: facts 19665 1727204160.25945: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204160.26142: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204160.26309: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204160.26558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.26603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.26636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.26696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.26720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.26782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.26824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.26857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.26910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.26934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.27104: variable 'network_connections' from source: play vars 19665 1727204160.27114: variable 'interface' from source: set_fact 19665 1727204160.27229: variable 'interface' from source: set_fact 19665 1727204160.27251: variable 'interface' from source: set_fact 19665 1727204160.27412: variable 'interface' from source: set_fact 19665 1727204160.27658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204160.27698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204160.27786: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.27825: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204160.27951: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204160.28772: variable 'network_connections' from source: play vars 19665 1727204160.28783: variable 'interface' from source: set_fact 19665 1727204160.29007: variable 'interface' from source: set_fact 19665 1727204160.29033: variable 'interface' from source: set_fact 19665 1727204160.29190: variable 'interface' from source: set_fact 19665 1727204160.29252: variable '__network_packages_default_wireless' from source: role '' defaults 19665 1727204160.29353: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204160.29882: variable 'network_connections' from source: play vars 19665 1727204160.29950: variable 'interface' from source: set_fact 19665 1727204160.30107: variable 'interface' from source: set_fact 19665 1727204160.30119: variable 'interface' from source: set_fact 19665 1727204160.30309: variable 'interface' from source: set_fact 19665 1727204160.30342: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204160.30551: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204160.31019: variable 'network_connections' from source: play vars 19665 1727204160.31030: variable 'interface' from source: set_fact 19665 1727204160.31104: variable 'interface' from source: set_fact 19665 1727204160.31115: variable 'interface' from source: set_fact 19665 1727204160.31184: variable 'interface' from source: set_fact 19665 1727204160.31374: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204160.31557: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204160.31575: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204160.31647: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204160.32823: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204160.33385: variable 'network_connections' from source: play vars 19665 1727204160.33398: variable 'interface' from source: set_fact 19665 
1727204160.33461: variable 'interface' from source: set_fact 19665 1727204160.33474: variable 'interface' from source: set_fact 19665 1727204160.33541: variable 'interface' from source: set_fact 19665 1727204160.33555: variable 'ansible_distribution' from source: facts 19665 1727204160.33563: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.33575: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.33611: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204160.33780: variable 'ansible_distribution' from source: facts 19665 1727204160.33789: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.33797: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.33810: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204160.33982: variable 'ansible_distribution' from source: facts 19665 1727204160.33991: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.34000: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.34046: variable 'network_provider' from source: set_fact 19665 1727204160.34069: variable 'ansible_facts' from source: unknown 19665 1727204160.34707: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 19665 1727204160.34715: when evaluation is False, skipping this task 19665 1727204160.34724: _execute() done 19665 1727204160.34740: dumping result to json 19665 1727204160.34840: done dumping result, returning 19665 1727204160.34854: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0dcc-3ea6-00000000001c] 19665 1727204160.34866: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001c skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19665 1727204160.35136: no more pending results, returning what we have 19665 1727204160.35147: results queue empty 19665 1727204160.35149: checking for any_errors_fatal 19665 1727204160.35158: done checking for any_errors_fatal 19665 1727204160.35159: checking for max_fail_percentage 19665 1727204160.35161: done checking for max_fail_percentage 19665 1727204160.35162: checking to see if all hosts have failed and the running result is not ok 19665 1727204160.35163: done checking to see if all hosts have failed 19665 1727204160.35165: getting the remaining hosts for this loop 19665 1727204160.35875: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001c 19665 1727204160.35879: WORKER PROCESS EXITING 19665 1727204160.35993: done getting the remaining hosts for this loop 19665 1727204160.35998: getting the next task for host managed-node3 19665 1727204160.36004: done getting next task for host managed-node3 19665 1727204160.36008: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204160.36010: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204160.36024: getting variables 19665 1727204160.36025: in VariableManager get_vars() 19665 1727204160.36059: Calling all_inventory to load vars for managed-node3 19665 1727204160.36062: Calling groups_inventory to load vars for managed-node3 19665 1727204160.36066: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204160.36080: Calling all_plugins_play to load vars for managed-node3 19665 1727204160.36085: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204160.36088: Calling groups_plugins_play to load vars for managed-node3 19665 1727204160.38132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204160.40879: done with get_vars() 19665 1727204160.40912: done getting variables 19665 1727204160.40985: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.280) 0:00:11.276 ***** 19665 1727204160.41017: entering _queue_task() for managed-node3/package 19665 1727204160.41495: worker is 1 (out of 1 available) 19665 1727204160.41508: exiting _queue_task() for managed-node3/package 19665 1727204160.41521: done queuing things up, now waiting for results queue to drain 19665 1727204160.41522: waiting for pending results... 
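[editor's note] The "Install packages" task above was skipped because its when-condition, not network_packages is subset(ansible_facts.packages.keys()), evaluated to False: every package the role wants is already present in the gathered package facts. A minimal Python sketch of that subset check follows; the package names in it are placeholders, since the log does not reveal the actual contents of network_packages or ansible_facts.packages.

    # Sketch of the Jinja2 "is subset" test used in the role's when-condition.
    # Both values below are hypothetical; only the shape of the check is taken
    # from the log above.
    network_packages = ["NetworkManager"]                  # hypothetical role default
    installed = {                                          # hypothetical package facts
        "NetworkManager": [{"version": "1.46.0"}],
        "openssh-server": [{"version": "8.7p1"}],
    }

    # "network_packages is subset(ansible_facts.packages.keys())"
    already_installed = set(network_packages) <= set(installed.keys())

    # The task runs only when the negation is true, i.e. something is missing.
    run_install_task = not already_installed
    print(run_install_task)   # False -> task skipped, matching the result printed above

Because the condition is False, the task result records only false_condition and skip_reason, exactly as shown a few lines earlier.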
19665 1727204160.42373: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204160.42489: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001d 19665 1727204160.42502: variable 'ansible_search_path' from source: unknown 19665 1727204160.42506: variable 'ansible_search_path' from source: unknown 19665 1727204160.42545: calling self._execute() 19665 1727204160.42642: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.42646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.42653: variable 'omit' from source: magic vars 19665 1727204160.43051: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.43065: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204160.43188: variable 'network_state' from source: role '' defaults 19665 1727204160.43197: Evaluated conditional (network_state != {}): False 19665 1727204160.43202: when evaluation is False, skipping this task 19665 1727204160.43205: _execute() done 19665 1727204160.43213: dumping result to json 19665 1727204160.43216: done dumping result, returning 19665 1727204160.43225: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-00000000001d] 19665 1727204160.43231: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001d 19665 1727204160.43326: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001d 19665 1727204160.43329: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204160.43377: no more pending results, returning what we have 19665 1727204160.43382: results queue empty 19665 1727204160.43383: checking for any_errors_fatal 19665 1727204160.43389: done checking for any_errors_fatal 19665 1727204160.43390: checking for max_fail_percentage 19665 1727204160.43392: done checking for max_fail_percentage 19665 1727204160.43392: checking to see if all hosts have failed and the running result is not ok 19665 1727204160.43393: done checking to see if all hosts have failed 19665 1727204160.43394: getting the remaining hosts for this loop 19665 1727204160.43395: done getting the remaining hosts for this loop 19665 1727204160.43399: getting the next task for host managed-node3 19665 1727204160.43405: done getting next task for host managed-node3 19665 1727204160.43409: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204160.43411: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204160.43426: getting variables 19665 1727204160.43428: in VariableManager get_vars() 19665 1727204160.43470: Calling all_inventory to load vars for managed-node3 19665 1727204160.43473: Calling groups_inventory to load vars for managed-node3 19665 1727204160.43475: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204160.43484: Calling all_plugins_play to load vars for managed-node3 19665 1727204160.43486: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204160.43489: Calling groups_plugins_play to load vars for managed-node3 19665 1727204160.45298: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204160.47482: done with get_vars() 19665 1727204160.47513: done getting variables 19665 1727204160.47583: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.065) 0:00:11.342 ***** 19665 1727204160.47614: entering _queue_task() for managed-node3/package 19665 1727204160.47980: worker is 1 (out of 1 available) 19665 1727204160.47992: exiting _queue_task() for managed-node3/package 19665 1727204160.48005: done queuing things up, now waiting for results queue to drain 19665 1727204160.48007: waiting for pending results... 
19665 1727204160.48295: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204160.48421: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001e 19665 1727204160.48445: variable 'ansible_search_path' from source: unknown 19665 1727204160.48456: variable 'ansible_search_path' from source: unknown 19665 1727204160.48499: calling self._execute() 19665 1727204160.48597: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.48609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.48622: variable 'omit' from source: magic vars 19665 1727204160.49012: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.49031: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204160.49169: variable 'network_state' from source: role '' defaults 19665 1727204160.49184: Evaluated conditional (network_state != {}): False 19665 1727204160.49192: when evaluation is False, skipping this task 19665 1727204160.49199: _execute() done 19665 1727204160.49207: dumping result to json 19665 1727204160.49218: done dumping result, returning 19665 1727204160.49230: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-00000000001e] 19665 1727204160.49243: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001e skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204160.49395: no more pending results, returning what we have 19665 1727204160.49400: results queue empty 19665 1727204160.49401: checking for any_errors_fatal 19665 1727204160.49409: done checking for any_errors_fatal 19665 1727204160.49410: checking for max_fail_percentage 19665 1727204160.49412: done checking for max_fail_percentage 19665 1727204160.49413: checking to see if all hosts have failed and the running result is not ok 19665 1727204160.49413: done checking to see if all hosts have failed 19665 1727204160.49414: getting the remaining hosts for this loop 19665 1727204160.49416: done getting the remaining hosts for this loop 19665 1727204160.49420: getting the next task for host managed-node3 19665 1727204160.49428: done getting next task for host managed-node3 19665 1727204160.49432: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204160.49434: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204160.49451: getting variables 19665 1727204160.49453: in VariableManager get_vars() 19665 1727204160.49495: Calling all_inventory to load vars for managed-node3 19665 1727204160.49497: Calling groups_inventory to load vars for managed-node3 19665 1727204160.49500: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204160.49512: Calling all_plugins_play to load vars for managed-node3 19665 1727204160.49515: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204160.49518: Calling groups_plugins_play to load vars for managed-node3 19665 1727204160.50691: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001e 19665 1727204160.50696: WORKER PROCESS EXITING 19665 1727204160.55949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204160.58468: done with get_vars() 19665 1727204160.58494: done getting variables 19665 1727204160.58576: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.109) 0:00:11.452 ***** 19665 1727204160.58601: entering _queue_task() for managed-node3/service 19665 1727204160.58603: Creating lock for service 19665 1727204160.58945: worker is 1 (out of 1 available) 19665 1727204160.58958: exiting _queue_task() for managed-node3/service 19665 1727204160.58971: done queuing things up, now waiting for results queue to drain 19665 1727204160.58972: waiting for pending results... 
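[editor's note] The two install tasks at tasks/main.yml:85 and :96 share the same gate, network_state != {}. In this run network_state resolves from the role defaults and both tasks skip, which implies the default is an empty mapping. A minimal sketch of that gate, under that assumption:

    # network_state comes from the role defaults here; the two skips above imply
    # it is empty. The value below is an assumption used only to mirror the
    # evaluated condition.
    network_state = {}

    use_nmstate = network_state != {}   # when-condition on both install tasks
    print(use_nmstate)                  # False -> NetworkManager/nmstate and
                                        # python3-libnmstate installs both skip

    # A play that supplied a non-empty network_state (for example a dict with an
    # "interfaces" list) would flip this to True and trigger both installs, as the
    # task names themselves suggest.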
19665 1727204160.59228: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204160.59352: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000001f 19665 1727204160.59375: variable 'ansible_search_path' from source: unknown 19665 1727204160.59384: variable 'ansible_search_path' from source: unknown 19665 1727204160.59426: calling self._execute() 19665 1727204160.59518: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.59534: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.59549: variable 'omit' from source: magic vars 19665 1727204160.59942: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.59968: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204160.60091: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204160.60284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204160.62584: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204160.62669: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204160.62710: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204160.62748: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204160.62785: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204160.62863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.62901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.62933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.62984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.63008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.63056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.63088: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.63125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 19665 1727204160.63172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.63190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.63237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.63269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.63298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.63345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.63366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.63547: variable 'network_connections' from source: play vars 19665 1727204160.63567: variable 'interface' from source: set_fact 19665 1727204160.63653: variable 'interface' from source: set_fact 19665 1727204160.63670: variable 'interface' from source: set_fact 19665 1727204160.63740: variable 'interface' from source: set_fact 19665 1727204160.63820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204160.64012: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204160.64071: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204160.64120: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204160.64158: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204160.64227: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204160.64284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204160.64329: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.64362: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204160.64431: variable '__network_team_connections_defined' from source: role '' defaults 
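[editor's note] The restart task being evaluated here uses the same gate as the consent prompt at the top of this section: __network_wireless_connections_defined or __network_team_connections_defined. Judging by the variables resolved around it (network_connections from play vars and the interface set_fact), those flags are presumably derived from the connection types listed in network_connections; the sketch below reconstructs that logic under that assumption, with an invented connection list.

    # Hypothetical reconstruction of the two role flags; the real defaults are
    # Jinja2 expressions inside the role and are not shown in this log. The
    # connection list below is made up for illustration only.
    network_connections = [
        {"name": "testnic0", "type": "ethernet", "interface_name": "testnic0"},
    ]

    wireless_defined = any(c.get("type") == "wireless" for c in network_connections)
    team_defined = any(c.get("type") == "team" for c in network_connections)

    # when: __network_wireless_connections_defined or __network_team_connections_defined
    restart_needed = wireless_defined or team_defined
    print(restart_needed)   # False -> both the consent prompt and this restart skip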
19665 1727204160.64758: variable 'network_connections' from source: play vars 19665 1727204160.64762: variable 'interface' from source: set_fact 19665 1727204160.64834: variable 'interface' from source: set_fact 19665 1727204160.64841: variable 'interface' from source: set_fact 19665 1727204160.64908: variable 'interface' from source: set_fact 19665 1727204160.64945: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204160.64949: when evaluation is False, skipping this task 19665 1727204160.64951: _execute() done 19665 1727204160.64954: dumping result to json 19665 1727204160.64956: done dumping result, returning 19665 1727204160.64963: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-00000000001f] 19665 1727204160.64995: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001f 19665 1727204160.65075: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000001f 19665 1727204160.65077: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204160.65138: no more pending results, returning what we have 19665 1727204160.65142: results queue empty 19665 1727204160.65143: checking for any_errors_fatal 19665 1727204160.65151: done checking for any_errors_fatal 19665 1727204160.65151: checking for max_fail_percentage 19665 1727204160.65153: done checking for max_fail_percentage 19665 1727204160.65154: checking to see if all hosts have failed and the running result is not ok 19665 1727204160.65154: done checking to see if all hosts have failed 19665 1727204160.65155: getting the remaining hosts for this loop 19665 1727204160.65157: done getting the remaining hosts for this loop 19665 1727204160.65161: getting the next task for host managed-node3 19665 1727204160.65169: done getting next task for host managed-node3 19665 1727204160.65173: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204160.65175: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204160.65187: getting variables 19665 1727204160.65189: in VariableManager get_vars() 19665 1727204160.65227: Calling all_inventory to load vars for managed-node3 19665 1727204160.65229: Calling groups_inventory to load vars for managed-node3 19665 1727204160.65231: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204160.65242: Calling all_plugins_play to load vars for managed-node3 19665 1727204160.65244: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204160.65247: Calling groups_plugins_play to load vars for managed-node3 19665 1727204160.66572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204160.69449: done with get_vars() 19665 1727204160.69480: done getting variables 19665 1727204160.69540: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:00 -0400 (0:00:00.115) 0:00:11.568 ***** 19665 1727204160.70155: entering _queue_task() for managed-node3/service 19665 1727204160.70476: worker is 1 (out of 1 available) 19665 1727204160.70490: exiting _queue_task() for managed-node3/service 19665 1727204160.70504: done queuing things up, now waiting for results queue to drain 19665 1727204160.70506: waiting for pending results... 19665 1727204160.71142: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204160.71286: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000020 19665 1727204160.71307: variable 'ansible_search_path' from source: unknown 19665 1727204160.71315: variable 'ansible_search_path' from source: unknown 19665 1727204160.71359: calling self._execute() 19665 1727204160.71461: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.71476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.71496: variable 'omit' from source: magic vars 19665 1727204160.71891: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.71910: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204160.72080: variable 'network_provider' from source: set_fact 19665 1727204160.72089: variable 'network_state' from source: role '' defaults 19665 1727204160.72103: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19665 1727204160.72113: variable 'omit' from source: magic vars 19665 1727204160.72158: variable 'omit' from source: magic vars 19665 1727204160.72193: variable 'network_service_name' from source: role '' defaults 19665 1727204160.72274: variable 'network_service_name' from source: role '' defaults 19665 1727204160.72386: variable '__network_provider_setup' from source: role '' defaults 19665 1727204160.72398: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204160.72467: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204160.72482: variable '__network_packages_default_nm' from source: role '' defaults 
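[editor's note] Unlike the previous tasks, "Enable and start NetworkManager" is not skipped: its gate, network_provider == "nm" or network_state != {}, evaluates True. Since the earlier skips showed network_state to be empty, the provider set via set_fact must be "nm". A minimal sketch of that deduction:

    # network_provider was established earlier via set_fact; the log only shows the
    # combined condition evaluating True, which together with the empty
    # network_state is consistent with the provider being "nm".
    network_provider = "nm"   # inferred from the trace, not printed verbatim
    network_state = {}        # role default, per the earlier skips

    enable_service = network_provider == "nm" or network_state != {}
    print(enable_service)     # True -> the service task proceeds, so the trace below
                              # resolves connection vars and opens an SSH session

With the gate True, the executor goes on to resolve network_service_name and the provider-setup defaults, set the connection variables (ssh, /bin/sh, pipelining disabled), and run the first low-level command on the managed node.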
19665 1727204160.72545: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204160.72777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204160.75849: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204160.75949: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204160.75997: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204160.76039: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204160.76075: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204160.76152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.76188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.76216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.76263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.76285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.76333: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.76362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.76394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.76445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.76469: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.76703: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204160.76992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.77118: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.77431: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.77532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.77552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.77626: variable 'ansible_python' from source: facts 19665 1727204160.77656: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204160.77726: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204160.78006: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204160.78133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.78161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.78193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.78237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.78282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.78457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204160.78559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204160.78615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.78838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204160.78895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204160.79172: variable 'network_connections' from 
source: play vars 19665 1727204160.79232: variable 'interface' from source: set_fact 19665 1727204160.79428: variable 'interface' from source: set_fact 19665 1727204160.79447: variable 'interface' from source: set_fact 19665 1727204160.79611: variable 'interface' from source: set_fact 19665 1727204160.79814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204160.80255: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204160.80376: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204160.80491: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204160.80570: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204160.80696: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204160.80772: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204160.80799: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204160.80846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204160.80944: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204160.81656: variable 'network_connections' from source: play vars 19665 1727204160.81691: variable 'interface' from source: set_fact 19665 1727204160.81858: variable 'interface' from source: set_fact 19665 1727204160.81874: variable 'interface' from source: set_fact 19665 1727204160.82073: variable 'interface' from source: set_fact 19665 1727204160.82285: variable '__network_packages_default_wireless' from source: role '' defaults 19665 1727204160.82458: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204160.82889: variable 'network_connections' from source: play vars 19665 1727204160.82898: variable 'interface' from source: set_fact 19665 1727204160.82971: variable 'interface' from source: set_fact 19665 1727204160.82982: variable 'interface' from source: set_fact 19665 1727204160.83052: variable 'interface' from source: set_fact 19665 1727204160.83083: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204160.83165: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204160.83451: variable 'network_connections' from source: play vars 19665 1727204160.83731: variable 'interface' from source: set_fact 19665 1727204160.83817: variable 'interface' from source: set_fact 19665 1727204160.83832: variable 'interface' from source: set_fact 19665 1727204160.83926: variable 'interface' from source: set_fact 19665 1727204160.84005: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204160.84086: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 
1727204160.84093: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204160.84215: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204160.84568: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204160.85048: variable 'network_connections' from source: play vars 19665 1727204160.85058: variable 'interface' from source: set_fact 19665 1727204160.85125: variable 'interface' from source: set_fact 19665 1727204160.85136: variable 'interface' from source: set_fact 19665 1727204160.85198: variable 'interface' from source: set_fact 19665 1727204160.85210: variable 'ansible_distribution' from source: facts 19665 1727204160.85219: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.85226: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.85260: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204160.85452: variable 'ansible_distribution' from source: facts 19665 1727204160.85461: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.85473: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.85590: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204160.86209: variable 'ansible_distribution' from source: facts 19665 1727204160.86241: variable '__network_rh_distros' from source: role '' defaults 19665 1727204160.86257: variable 'ansible_distribution_major_version' from source: facts 19665 1727204160.86341: variable 'network_provider' from source: set_fact 19665 1727204160.86372: variable 'omit' from source: magic vars 19665 1727204160.86467: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204160.86538: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204160.86577: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204160.86598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204160.86612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204160.86645: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204160.86653: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.86660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.86904: Set connection var ansible_connection to ssh 19665 1727204160.86917: Set connection var ansible_shell_type to sh 19665 1727204160.86927: Set connection var ansible_timeout to 10 19665 1727204160.86939: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204160.87009: Set connection var ansible_shell_executable to /bin/sh 19665 1727204160.87100: Set connection var ansible_pipelining to False 19665 1727204160.87145: variable 'ansible_shell_executable' from source: unknown 19665 1727204160.87155: variable 'ansible_connection' from source: unknown 19665 1727204160.87222: variable 'ansible_module_compression' from source: unknown 19665 1727204160.87230: variable 'ansible_shell_type' from source: unknown 19665 1727204160.87237: variable 
'ansible_shell_executable' from source: unknown 19665 1727204160.87247: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204160.87307: variable 'ansible_pipelining' from source: unknown 19665 1727204160.87318: variable 'ansible_timeout' from source: unknown 19665 1727204160.87377: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204160.87510: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204160.87527: variable 'omit' from source: magic vars 19665 1727204160.87540: starting attempt loop 19665 1727204160.87568: running the handler 19665 1727204160.88020: variable 'ansible_facts' from source: unknown 19665 1727204160.90219: _low_level_execute_command(): starting 19665 1727204160.90243: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204160.91241: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204160.91260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204160.91279: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204160.91298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204160.91366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204160.91381: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204160.91397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204160.91417: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204160.91429: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204160.91446: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204160.91460: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204160.91478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204160.91494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204160.91510: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204160.91560: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204160.91578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204160.91729: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204160.91751: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204160.91817: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204160.91924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204160.93559: stdout chunk (state=3): >>>/root <<< 19665 1727204160.93800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204160.93804: stdout chunk (state=3): >>><<< 19665 1727204160.93807: stderr chunk (state=3): 
>>><<< 19665 1727204160.93945: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204160.93950: _low_level_execute_command(): starting 19665 1727204160.93956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520 `" && echo ansible-tmp-1727204160.938348-20599-73218176904520="` echo /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520 `" ) && sleep 0' 19665 1727204160.95232: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204160.95236: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204160.95241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204160.95244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204160.95247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204160.95249: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204160.95251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204160.95253: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204160.95256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204160.95258: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204160.95260: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204160.95569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204160.95573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204160.95577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204160.95581: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204160.95583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204160.95585: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 
1727204160.95587: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204160.95589: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204160.95591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204160.97330: stdout chunk (state=3): >>>ansible-tmp-1727204160.938348-20599-73218176904520=/root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520 <<< 19665 1727204160.97492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204160.97496: stdout chunk (state=3): >>><<< 19665 1727204160.97502: stderr chunk (state=3): >>><<< 19665 1727204160.97530: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204160.938348-20599-73218176904520=/root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204160.97568: variable 'ansible_module_compression' from source: unknown 19665 1727204160.97630: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 19665 1727204160.97634: ANSIBALLZ: Acquiring lock 19665 1727204160.97637: ANSIBALLZ: Lock acquired: 140619596462752 19665 1727204160.97639: ANSIBALLZ: Creating module 19665 1727204161.40894: ANSIBALLZ: Writing module into payload 19665 1727204161.41345: ANSIBALLZ: Writing module 19665 1727204161.41399: ANSIBALLZ: Renaming module 19665 1727204161.41458: ANSIBALLZ: Done creating module 19665 1727204161.41512: variable 'ansible_facts' from source: unknown 19665 1727204161.41759: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/AnsiballZ_systemd.py 19665 1727204161.41929: Sending initial data 19665 1727204161.41932: Sent initial data (154 bytes) 19665 1727204161.43652: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204161.43660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.43673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.43691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.43817: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.43821: stderr chunk (state=3): >>>debug2: match not found <<< 19665 
1727204161.43823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.43826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204161.43828: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204161.43830: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204161.43831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.43833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.44534: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.44541: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.44549: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204161.44559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.44751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204161.44773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204161.44786: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204161.44876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204161.46701: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204161.46741: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204161.46783: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpfemoo4bh /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/AnsiballZ_systemd.py <<< 19665 1727204161.46831: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204161.49989: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204161.50255: stderr chunk (state=3): >>><<< 19665 1727204161.50259: stdout chunk (state=3): >>><<< 19665 1727204161.50261: done transferring module to remote 19665 1727204161.50273: _low_level_execute_command(): starting 19665 1727204161.50277: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/ /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/AnsiballZ_systemd.py && sleep 0' 19665 1727204161.51757: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204161.51814: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.51830: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 19665 1727204161.51852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.51943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.51957: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204161.51975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.51994: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204161.52026: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204161.52041: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204161.52055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.52073: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.52089: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.52135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.52150: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204161.52167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.52361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204161.52380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204161.52395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204161.52575: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204161.54419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204161.54425: stdout chunk (state=3): >>><<< 19665 1727204161.54428: stderr chunk (state=3): >>><<< 19665 1727204161.54470: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204161.54479: _low_level_execute_command(): starting 19665 1727204161.54482: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 
/root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/AnsiballZ_systemd.py && sleep 0' 19665 1727204161.56104: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204161.56120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.56139: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.56159: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.56210: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.56279: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204161.56297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.56314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204161.56328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204161.56342: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204161.56354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.56368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.56383: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.56397: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.56407: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204161.56420: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.56539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204161.56629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204161.56648: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204161.56846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204161.81898: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; 
start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.<<< 19665 1727204161.81958: stdout chunk (state=3): >>>service", "ControlGroupId": "2418", "MemoryCurrent": "16101376", "MemoryAvailable": "infinity", "CPUUsageNSec": "1359390000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", 
"SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": 
"10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19665 1727204161.83589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204161.83596: stdout chunk (state=3): >>><<< 19665 1727204161.83599: stderr chunk (state=3): >>><<< 19665 1727204161.83990: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16101376", "MemoryAvailable": "infinity", "CPUUsageNSec": "1359390000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not 
set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", 
"RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204161.84006: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204161.84010: _low_level_execute_command(): starting 19665 1727204161.84016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204160.938348-20599-73218176904520/ > /dev/null 2>&1 && sleep 0' 19665 1727204161.85272: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204161.85288: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.85303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.85349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.85460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.85487: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204161.85516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.85548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204161.85561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204161.85575: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204161.85585: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204161.85596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204161.85622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204161.85648: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204161.85658: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204161.85672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204161.85750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204161.85768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204161.85781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204161.85913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204161.87775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204161.87779: stdout chunk (state=3): >>><<< 19665 1727204161.87782: stderr chunk (state=3): >>><<< 19665 1727204161.87978: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204161.87982: handler run complete 19665 1727204161.87984: attempt loop complete, returning result 19665 1727204161.87987: _execute() done 19665 1727204161.87989: dumping result to json 19665 1727204161.87991: done dumping result, returning 19665 1727204161.87993: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0dcc-3ea6-000000000020] 19665 1727204161.87995: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000020 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204161.88690: no more pending results, returning what we have 19665 1727204161.88693: results queue empty 19665 1727204161.88694: checking for any_errors_fatal 19665 1727204161.88704: done checking for any_errors_fatal 19665 1727204161.88705: checking for max_fail_percentage 19665 1727204161.88707: done checking for max_fail_percentage 19665 1727204161.88708: checking to see if all hosts have failed and the running result is not ok 19665 1727204161.88710: done checking to see if all hosts have failed 19665 1727204161.88711: getting the remaining hosts for this loop 19665 1727204161.88713: done getting the remaining hosts for this loop 19665 1727204161.88717: getting the next task for host managed-node3 19665 1727204161.88725: done getting next task for host managed-node3 19665 1727204161.88729: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204161.88733: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204161.88744: getting variables 19665 1727204161.88746: in VariableManager get_vars() 19665 1727204161.88794: Calling all_inventory to load vars for managed-node3 19665 1727204161.88797: Calling groups_inventory to load vars for managed-node3 19665 1727204161.88799: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204161.88813: Calling all_plugins_play to load vars for managed-node3 19665 1727204161.88818: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204161.88822: Calling groups_plugins_play to load vars for managed-node3 19665 1727204161.89345: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000020 19665 1727204161.89349: WORKER PROCESS EXITING 19665 1727204161.89772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204161.90910: done with get_vars() 19665 1727204161.90951: done getting variables 19665 1727204161.91034: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:01 -0400 (0:00:01.209) 0:00:12.777 ***** 19665 1727204161.91091: entering _queue_task() for managed-node3/service 19665 1727204161.91397: worker is 1 (out of 1 available) 19665 1727204161.91416: exiting _queue_task() for managed-node3/service 19665 1727204161.91434: done queuing things up, now waiting for results queue to drain 19665 1727204161.91436: waiting for pending results... 
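Note that the result for the NetworkManager task above is printed only as a censored placeholder: the message itself states that 'no_log: true' was specified for the result, and the _execute_module call records '_ansible_no_log': True. As a minimal, purely illustrative sketch (not taken from the role's tasks), any task carrying the no_log keyword is reported this way regardless of what the module returns:

    # Hedged illustration: no_log suppresses the result body in play output,
    # producing the "output has been hidden ..." message seen above.
    - name: Some task whose result must not be logged
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
      no_log: true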
19665 1727204161.91732: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204161.92132: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000021 19665 1727204161.92136: variable 'ansible_search_path' from source: unknown 19665 1727204161.92139: variable 'ansible_search_path' from source: unknown 19665 1727204161.92141: calling self._execute() 19665 1727204161.92144: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204161.92147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204161.92150: variable 'omit' from source: magic vars 19665 1727204161.92509: variable 'ansible_distribution_major_version' from source: facts 19665 1727204161.92513: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204161.92516: variable 'network_provider' from source: set_fact 19665 1727204161.92518: Evaluated conditional (network_provider == "nm"): True 19665 1727204161.92907: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204161.92912: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204161.93747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204161.97350: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204161.97619: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204161.97657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204161.97690: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204161.97714: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204161.98005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204161.98083: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204161.98086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204161.98135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204161.98169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204161.98245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204161.98298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 19665 1727204161.98347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204161.98416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204161.98454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204161.98502: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204161.98526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204161.98555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204161.98599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204161.98617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204161.98832: variable 'network_connections' from source: play vars 19665 1727204161.98852: variable 'interface' from source: set_fact 19665 1727204161.98946: variable 'interface' from source: set_fact 19665 1727204161.98959: variable 'interface' from source: set_fact 19665 1727204161.99033: variable 'interface' from source: set_fact 19665 1727204161.99128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204162.00001: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204162.00046: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204162.00290: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204162.00327: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204162.00376: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204162.00404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204162.00449: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204162.00481: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204162.00546: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204162.01432: variable 'network_connections' from source: play vars 19665 1727204162.01435: variable 'interface' from source: set_fact 19665 1727204162.01441: variable 'interface' from source: set_fact 19665 1727204162.01443: variable 'interface' from source: set_fact 19665 1727204162.01445: variable 'interface' from source: set_fact 19665 1727204162.01482: Evaluated conditional (__network_wpa_supplicant_required): False 19665 1727204162.01485: when evaluation is False, skipping this task 19665 1727204162.01488: _execute() done 19665 1727204162.01500: dumping result to json 19665 1727204162.01503: done dumping result, returning 19665 1727204162.01505: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0dcc-3ea6-000000000021] 19665 1727204162.01508: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000021 19665 1727204162.01614: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000021 19665 1727204162.01618: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19665 1727204162.01671: no more pending results, returning what we have 19665 1727204162.01675: results queue empty 19665 1727204162.01677: checking for any_errors_fatal 19665 1727204162.01698: done checking for any_errors_fatal 19665 1727204162.01699: checking for max_fail_percentage 19665 1727204162.01701: done checking for max_fail_percentage 19665 1727204162.01702: checking to see if all hosts have failed and the running result is not ok 19665 1727204162.01703: done checking to see if all hosts have failed 19665 1727204162.01704: getting the remaining hosts for this loop 19665 1727204162.01706: done getting the remaining hosts for this loop 19665 1727204162.01710: getting the next task for host managed-node3 19665 1727204162.01718: done getting next task for host managed-node3 19665 1727204162.01723: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204162.01725: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204162.01740: getting variables 19665 1727204162.01742: in VariableManager get_vars() 19665 1727204162.01781: Calling all_inventory to load vars for managed-node3 19665 1727204162.01783: Calling groups_inventory to load vars for managed-node3 19665 1727204162.01785: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204162.01795: Calling all_plugins_play to load vars for managed-node3 19665 1727204162.01798: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204162.01800: Calling groups_plugins_play to load vars for managed-node3 19665 1727204162.04453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204162.08128: done with get_vars() 19665 1727204162.08430: done getting variables 19665 1727204162.08496: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.174) 0:00:12.951 ***** 19665 1727204162.08528: entering _queue_task() for managed-node3/service 19665 1727204162.08866: worker is 1 (out of 1 available) 19665 1727204162.08878: exiting _queue_task() for managed-node3/service 19665 1727204162.08891: done queuing things up, now waiting for results queue to drain 19665 1727204162.08893: waiting for pending results... 19665 1727204162.09170: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204162.09292: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000022 19665 1727204162.09313: variable 'ansible_search_path' from source: unknown 19665 1727204162.09321: variable 'ansible_search_path' from source: unknown 19665 1727204162.09374: calling self._execute() 19665 1727204162.09478: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204162.09489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204162.09504: variable 'omit' from source: magic vars 19665 1727204162.10161: variable 'ansible_distribution_major_version' from source: facts 19665 1727204162.10224: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204162.10465: variable 'network_provider' from source: set_fact 19665 1727204162.10478: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204162.10486: when evaluation is False, skipping this task 19665 1727204162.10493: _execute() done 19665 1727204162.10500: dumping result to json 19665 1727204162.10508: done dumping result, returning 19665 1727204162.10519: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0dcc-3ea6-000000000022] 19665 1727204162.10536: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000022 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204162.10700: no more pending results, returning what we have 19665 1727204162.10704: results queue empty 19665 1727204162.10706: checking for 
any_errors_fatal 19665 1727204162.10716: done checking for any_errors_fatal 19665 1727204162.10717: checking for max_fail_percentage 19665 1727204162.10719: done checking for max_fail_percentage 19665 1727204162.10720: checking to see if all hosts have failed and the running result is not ok 19665 1727204162.10721: done checking to see if all hosts have failed 19665 1727204162.10722: getting the remaining hosts for this loop 19665 1727204162.10724: done getting the remaining hosts for this loop 19665 1727204162.10728: getting the next task for host managed-node3 19665 1727204162.10736: done getting next task for host managed-node3 19665 1727204162.10744: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204162.10747: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204162.10762: getting variables 19665 1727204162.10767: in VariableManager get_vars() 19665 1727204162.10808: Calling all_inventory to load vars for managed-node3 19665 1727204162.10810: Calling groups_inventory to load vars for managed-node3 19665 1727204162.10813: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204162.10828: Calling all_plugins_play to load vars for managed-node3 19665 1727204162.10831: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204162.10835: Calling groups_plugins_play to load vars for managed-node3 19665 1727204162.11879: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000022 19665 1727204162.11883: WORKER PROCESS EXITING 19665 1727204162.12619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204162.14474: done with get_vars() 19665 1727204162.14506: done getting variables 19665 1727204162.14567: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.060) 0:00:13.012 ***** 19665 1727204162.14595: entering _queue_task() for managed-node3/copy 19665 1727204162.14933: worker is 1 (out of 1 available) 19665 1727204162.14947: exiting _queue_task() for managed-node3/copy 19665 1727204162.14960: done queuing things up, now waiting for results queue to drain 19665 1727204162.14962: waiting for pending results... 
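The two tasks traced around this point ("Enable network service" at tasks/main.yml:142 and "Ensure initscripts network file dependency is present" at tasks/main.yml:150) are both governed by the conditional network_provider == "initscripts", which evaluates to False in this run; the provider is nm, as the later network_connections module arguments confirm. A minimal sketch of what such a guarded task looks like, assuming the service name and argument layout (only the task names, file positions, action plugins, and when-conditions are visible in the trace):

    # Sketch of the skipped task at roles/network/tasks/main.yml:142.
    # The service name "network" and the enabled value are assumptions;
    # the when-conditions are the ones evaluated in the trace above.
    - name: Enable network service
      ansible.builtin.service:
        name: network        # assumed: the legacy initscripts network service
        enabled: true
      when:
        - ansible_distribution_major_version != '6'   # evaluated True
        - network_provider == "initscripts"           # evaluated False, so the task is skipped

The companion task at main.yml:150 is guarded the same way but uses the copy action, matching the "Loading ActionModule 'copy'" line in the trace.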
19665 1727204162.15251: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204162.15360: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000023 19665 1727204162.15382: variable 'ansible_search_path' from source: unknown 19665 1727204162.15390: variable 'ansible_search_path' from source: unknown 19665 1727204162.15431: calling self._execute() 19665 1727204162.15535: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204162.15550: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204162.15567: variable 'omit' from source: magic vars 19665 1727204162.16203: variable 'ansible_distribution_major_version' from source: facts 19665 1727204162.16221: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204162.16448: variable 'network_provider' from source: set_fact 19665 1727204162.16505: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204162.16541: when evaluation is False, skipping this task 19665 1727204162.16549: _execute() done 19665 1727204162.16556: dumping result to json 19665 1727204162.16575: done dumping result, returning 19665 1727204162.16620: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0dcc-3ea6-000000000023] 19665 1727204162.16633: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000023 19665 1727204162.16812: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000023 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19665 1727204162.16870: no more pending results, returning what we have 19665 1727204162.16874: results queue empty 19665 1727204162.16876: checking for any_errors_fatal 19665 1727204162.16883: done checking for any_errors_fatal 19665 1727204162.16884: checking for max_fail_percentage 19665 1727204162.16886: done checking for max_fail_percentage 19665 1727204162.16887: checking to see if all hosts have failed and the running result is not ok 19665 1727204162.16887: done checking to see if all hosts have failed 19665 1727204162.16888: getting the remaining hosts for this loop 19665 1727204162.16890: done getting the remaining hosts for this loop 19665 1727204162.16895: getting the next task for host managed-node3 19665 1727204162.16902: done getting next task for host managed-node3 19665 1727204162.16907: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204162.16909: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204162.16924: getting variables 19665 1727204162.16927: in VariableManager get_vars() 19665 1727204162.16972: Calling all_inventory to load vars for managed-node3 19665 1727204162.16975: Calling groups_inventory to load vars for managed-node3 19665 1727204162.16978: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204162.16991: Calling all_plugins_play to load vars for managed-node3 19665 1727204162.16994: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204162.16998: Calling groups_plugins_play to load vars for managed-node3 19665 1727204162.18084: WORKER PROCESS EXITING 19665 1727204162.20539: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204162.24421: done with get_vars() 19665 1727204162.24461: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:02 -0400 (0:00:00.100) 0:00:13.113 ***** 19665 1727204162.24680: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204162.24682: Creating lock for fedora.linux_system_roles.network_connections 19665 1727204162.25525: worker is 1 (out of 1 available) 19665 1727204162.25538: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204162.25552: done queuing things up, now waiting for results queue to drain 19665 1727204162.25554: waiting for pending results... 19665 1727204162.25934: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204162.26051: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000024 19665 1727204162.26075: variable 'ansible_search_path' from source: unknown 19665 1727204162.26084: variable 'ansible_search_path' from source: unknown 19665 1727204162.26125: calling self._execute() 19665 1727204162.26225: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204162.26240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204162.26256: variable 'omit' from source: magic vars 19665 1727204162.26705: variable 'ansible_distribution_major_version' from source: facts 19665 1727204162.26726: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204162.26739: variable 'omit' from source: magic vars 19665 1727204162.26785: variable 'omit' from source: magic vars 19665 1727204162.26947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204162.30275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204162.30352: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204162.30406: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204162.30452: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204162.30487: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204162.30581: variable 'network_provider' from source: set_fact 19665 1727204162.30790: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204162.30869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204162.30929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204162.30981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204162.31001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204162.31085: variable 'omit' from source: magic vars 19665 1727204162.31212: variable 'omit' from source: magic vars 19665 1727204162.31322: variable 'network_connections' from source: play vars 19665 1727204162.31341: variable 'interface' from source: set_fact 19665 1727204162.31417: variable 'interface' from source: set_fact 19665 1727204162.31431: variable 'interface' from source: set_fact 19665 1727204162.31656: variable 'interface' from source: set_fact 19665 1727204162.31823: variable 'omit' from source: magic vars 19665 1727204162.31840: variable '__lsr_ansible_managed' from source: task vars 19665 1727204162.31906: variable '__lsr_ansible_managed' from source: task vars 19665 1727204162.32302: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19665 1727204162.32739: Loaded config def from plugin (lookup/template) 19665 1727204162.32750: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19665 1727204162.32785: File lookup term: get_ansible_managed.j2 19665 1727204162.32865: variable 'ansible_search_path' from source: unknown 19665 1727204162.32878: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19665 1727204162.32897: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19665 1727204162.32918: variable 'ansible_search_path' from source: 
unknown 19665 1727204162.41168: variable 'ansible_managed' from source: unknown 19665 1727204162.41317: variable 'omit' from source: magic vars 19665 1727204162.41357: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204162.41392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204162.41416: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204162.41442: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204162.41457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204162.41492: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204162.41502: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204162.41510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204162.41605: Set connection var ansible_connection to ssh 19665 1727204162.41619: Set connection var ansible_shell_type to sh 19665 1727204162.41631: Set connection var ansible_timeout to 10 19665 1727204162.41645: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204162.41658: Set connection var ansible_shell_executable to /bin/sh 19665 1727204162.41674: Set connection var ansible_pipelining to False 19665 1727204162.41702: variable 'ansible_shell_executable' from source: unknown 19665 1727204162.41710: variable 'ansible_connection' from source: unknown 19665 1727204162.41717: variable 'ansible_module_compression' from source: unknown 19665 1727204162.41725: variable 'ansible_shell_type' from source: unknown 19665 1727204162.41733: variable 'ansible_shell_executable' from source: unknown 19665 1727204162.41744: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204162.41752: variable 'ansible_pipelining' from source: unknown 19665 1727204162.41759: variable 'ansible_timeout' from source: unknown 19665 1727204162.41769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204162.41905: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204162.41929: variable 'omit' from source: magic vars 19665 1727204162.41944: starting attempt loop 19665 1727204162.41952: running the handler 19665 1727204162.41972: _low_level_execute_command(): starting 19665 1727204162.41985: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204162.42730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204162.42751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204162.42771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204162.42790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204162.42835: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204162.42851: stderr chunk (state=3): >>>debug2: match not found <<< 19665 
1727204162.42868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204162.42887: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204162.42900: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204162.42912: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204162.42925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204162.42943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204162.42960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204162.42981: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204162.42993: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204162.43007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204162.43087: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204162.43112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204162.43129: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204162.43216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204162.44851: stdout chunk (state=3): >>>/root <<< 19665 1727204162.45026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204162.45030: stderr chunk (state=3): >>><<< 19665 1727204162.45035: stdout chunk (state=3): >>><<< 19665 1727204162.45074: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204162.45087: _low_level_execute_command(): starting 19665 1727204162.45093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768 `" && echo ansible-tmp-1727204162.4507415-20742-202289508628768="` echo /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768 `" ) && sleep 0' 19665 1727204162.46401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204162.46933: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204162.46940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204162.46962: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204162.47035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204162.48903: stdout chunk (state=3): >>>ansible-tmp-1727204162.4507415-20742-202289508628768=/root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768 <<< 19665 1727204162.49071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204162.49075: stderr chunk (state=3): >>><<< 19665 1727204162.49078: stdout chunk (state=3): >>><<< 19665 1727204162.49098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204162.4507415-20742-202289508628768=/root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204162.49146: variable 'ansible_module_compression' from source: unknown 19665 1727204162.49192: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 19665 1727204162.49195: ANSIBALLZ: Acquiring lock 19665 1727204162.49198: ANSIBALLZ: Lock acquired: 140619630201936 19665 1727204162.49200: ANSIBALLZ: Creating module 19665 1727204162.98975: ANSIBALLZ: Writing module into payload 19665 1727204162.99588: ANSIBALLZ: Writing module 19665 1727204162.99619: ANSIBALLZ: Renaming module 19665 1727204162.99624: ANSIBALLZ: Done 
creating module 19665 1727204162.99651: variable 'ansible_facts' from source: unknown 19665 1727204162.99840: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/AnsiballZ_network_connections.py 19665 1727204163.00254: Sending initial data 19665 1727204163.00258: Sent initial data (168 bytes) 19665 1727204163.01541: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204163.01546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.01586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.01610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.01659: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.01676: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204163.01692: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.01710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204163.01722: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204163.01733: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204163.01747: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.01761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.01782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.01797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.01809: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204163.01824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.01906: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204163.01923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204163.01938: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204163.02134: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204163.03944: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204163.03949: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204163.03998: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpmxf572jl 
/root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/AnsiballZ_network_connections.py <<< 19665 1727204163.04054: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204163.06499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204163.06842: stderr chunk (state=3): >>><<< 19665 1727204163.06846: stdout chunk (state=3): >>><<< 19665 1727204163.06849: done transferring module to remote 19665 1727204163.06851: _low_level_execute_command(): starting 19665 1727204163.06853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/ /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/AnsiballZ_network_connections.py && sleep 0' 19665 1727204163.08377: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204163.08453: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.08475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.08500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.08554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.08575: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204163.08627: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.08649: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204163.08663: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204163.08678: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204163.08691: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.08706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.08729: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.08772: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.08785: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204163.08800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.08895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204163.08929: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204163.08952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204163.09030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204163.10956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204163.10961: stdout chunk (state=3): >>><<< 19665 1727204163.10963: stderr chunk (state=3): >>><<< 19665 1727204163.11091: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204163.11098: _low_level_execute_command(): starting 19665 1727204163.11101: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/AnsiballZ_network_connections.py && sleep 0' 19665 1727204163.12203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204163.12212: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.12223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.12236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.12340: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.12349: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204163.12359: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.12376: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204163.12384: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204163.12390: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204163.12399: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.12409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.12427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.12436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.12445: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204163.12453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.12530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204163.12555: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204163.12569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204163.12646: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204163.40104: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, 
fe68f071-1086-45ef-92de-86b998c54595\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19665 1727204163.42189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204163.42193: stdout chunk (state=3): >>><<< 19665 1727204163.42199: stderr chunk (state=3): >>><<< 19665 1727204163.42235: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "interface_name": "LSR-TST-br31", "state": "up", "type": "bridge", "ip": {"dhcp4": false, "auto6": true}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
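The module_args echoed in the result above make it possible to read back the play-level input that reached the role. Stripping the role-injected fields (provider, __header, ignore_errors, force_state_change, __debug_flags), the connections list corresponds to a network_connections variable equivalent to the following; in the play it is presumably templated from the interface fact (set_fact) whose resolution is traced earlier, with interface evaluating to "LSR-TST-br31":

    # Reconstructed from the module_args shown in the trace; values are verbatim.
    network_connections:
      - name: LSR-TST-br31
        interface_name: LSR-TST-br31
        state: up
        type: bridge
        ip:
          dhcp4: false
          auto6: true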
19665 1727204163.42296: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'interface_name': 'LSR-TST-br31', 'state': 'up', 'type': 'bridge', 'ip': {'dhcp4': False, 'auto6': True}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204163.42299: _low_level_execute_command(): starting 19665 1727204163.42304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204162.4507415-20742-202289508628768/ > /dev/null 2>&1 && sleep 0' 19665 1727204163.43074: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204163.43083: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.43100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.43113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.43159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.43163: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204163.43178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.43191: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204163.43201: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204163.43208: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204163.43218: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204163.43232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204163.43247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204163.43255: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204163.43262: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204163.43273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204163.43353: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204163.43374: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204163.43388: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204163.43461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204163.45468: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 19665 1727204163.45472: stdout chunk (state=3): >>><<< 19665 1727204163.45475: stderr chunk (state=3): >>><<< 19665 1727204163.45494: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204163.45497: handler run complete 19665 1727204163.45532: attempt loop complete, returning result 19665 1727204163.45536: _execute() done 19665 1727204163.45538: dumping result to json 19665 1727204163.45546: done dumping result, returning 19665 1727204163.45557: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0dcc-3ea6-000000000024] 19665 1727204163.45560: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000024 19665 1727204163.45721: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000024 19665 1727204163.45724: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 [004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active) 19665 1727204163.45838: no more pending results, returning what we have 19665 1727204163.45842: results queue empty 19665 1727204163.45844: checking for any_errors_fatal 19665 1727204163.45854: done checking for any_errors_fatal 19665 1727204163.45854: checking for max_fail_percentage 19665 1727204163.45856: done checking for max_fail_percentage 19665 1727204163.45857: checking to see if all hosts have failed and the running result is not ok 19665 1727204163.45857: done checking to see if all hosts have failed 19665 1727204163.45858: getting the remaining hosts for this loop 19665 1727204163.45860: done getting the remaining hosts for this loop 19665 1727204163.45865: getting the next task for host managed-node3 19665 1727204163.45871: done getting next task for host managed-node3 19665 1727204163.45875: ^ task is: TASK: 
fedora.linux_system_roles.network : Configure networking state 19665 1727204163.45876: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204163.45886: getting variables 19665 1727204163.45887: in VariableManager get_vars() 19665 1727204163.45921: Calling all_inventory to load vars for managed-node3 19665 1727204163.45924: Calling groups_inventory to load vars for managed-node3 19665 1727204163.45926: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204163.45935: Calling all_plugins_play to load vars for managed-node3 19665 1727204163.45937: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204163.45940: Calling groups_plugins_play to load vars for managed-node3 19665 1727204163.47439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204163.51944: done with get_vars() 19665 1727204163.52128: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:03 -0400 (0:00:01.276) 0:00:14.390 ***** 19665 1727204163.52342: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204163.52344: Creating lock for fedora.linux_system_roles.network_state 19665 1727204163.53049: worker is 1 (out of 1 available) 19665 1727204163.53060: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204163.53075: done queuing things up, now waiting for results queue to drain 19665 1727204163.53077: waiting for pending results... 
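The "# Ansible managed" header passed to the module as __header is produced by the template lookup traced earlier (File lookup term: get_ansible_managed.j2, resolved under the role's templates directory). A sketch of how that header can be captured with the same lookup; the variable name __lsr_ansible_managed appears in the trace as a task var, but the exact task form shown here is an assumption for illustration only:

    # Sketch only: binds the rendered get_ansible_managed.j2 template to a variable.
    # The trace shows __lsr_ansible_managed coming from task vars; set_fact is used
    # here purely to keep the example self-contained.
    - name: Render the Ansible-managed header used in generated profiles
      ansible.builtin.set_fact:
        __lsr_ansible_managed: "{{ lookup('template', 'get_ansible_managed.j2') }}"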
19665 1727204163.54246: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 19665 1727204163.54329: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000025 19665 1727204163.54346: variable 'ansible_search_path' from source: unknown 19665 1727204163.54349: variable 'ansible_search_path' from source: unknown 19665 1727204163.54388: calling self._execute() 19665 1727204163.54489: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.54493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.54508: variable 'omit' from source: magic vars 19665 1727204163.55603: variable 'ansible_distribution_major_version' from source: facts 19665 1727204163.55616: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204163.55753: variable 'network_state' from source: role '' defaults 19665 1727204163.55766: Evaluated conditional (network_state != {}): False 19665 1727204163.55769: when evaluation is False, skipping this task 19665 1727204163.55773: _execute() done 19665 1727204163.55775: dumping result to json 19665 1727204163.55778: done dumping result, returning 19665 1727204163.55786: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0dcc-3ea6-000000000025] 19665 1727204163.55792: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000025 19665 1727204163.55886: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000025 19665 1727204163.55889: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204163.55970: no more pending results, returning what we have 19665 1727204163.55974: results queue empty 19665 1727204163.55975: checking for any_errors_fatal 19665 1727204163.55989: done checking for any_errors_fatal 19665 1727204163.55990: checking for max_fail_percentage 19665 1727204163.55992: done checking for max_fail_percentage 19665 1727204163.55993: checking to see if all hosts have failed and the running result is not ok 19665 1727204163.55993: done checking to see if all hosts have failed 19665 1727204163.55994: getting the remaining hosts for this loop 19665 1727204163.55996: done getting the remaining hosts for this loop 19665 1727204163.56001: getting the next task for host managed-node3 19665 1727204163.56007: done getting next task for host managed-node3 19665 1727204163.56012: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204163.56014: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204163.56030: getting variables 19665 1727204163.56032: in VariableManager get_vars() 19665 1727204163.56067: Calling all_inventory to load vars for managed-node3 19665 1727204163.56069: Calling groups_inventory to load vars for managed-node3 19665 1727204163.56071: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204163.56081: Calling all_plugins_play to load vars for managed-node3 19665 1727204163.56083: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204163.56085: Calling groups_plugins_play to load vars for managed-node3 19665 1727204163.58921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204163.62690: done with get_vars() 19665 1727204163.62793: done getting variables 19665 1727204163.62981: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.106) 0:00:14.496 ***** 19665 1727204163.63013: entering _queue_task() for managed-node3/debug 19665 1727204163.63810: worker is 1 (out of 1 available) 19665 1727204163.63823: exiting _queue_task() for managed-node3/debug 19665 1727204163.63836: done queuing things up, now waiting for results queue to drain 19665 1727204163.63837: waiting for pending results... 19665 1727204163.64339: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204163.64434: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000026 19665 1727204163.64455: variable 'ansible_search_path' from source: unknown 19665 1727204163.64458: variable 'ansible_search_path' from source: unknown 19665 1727204163.64495: calling self._execute() 19665 1727204163.64586: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.64592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.64601: variable 'omit' from source: magic vars 19665 1727204163.66215: variable 'ansible_distribution_major_version' from source: facts 19665 1727204163.66227: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204163.66233: variable 'omit' from source: magic vars 19665 1727204163.66280: variable 'omit' from source: magic vars 19665 1727204163.66319: variable 'omit' from source: magic vars 19665 1727204163.66367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204163.66402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204163.66425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204163.66446: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204163.66457: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204163.66491: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204163.66495: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.66498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.66602: Set connection var ansible_connection to ssh 19665 1727204163.66610: Set connection var ansible_shell_type to sh 19665 1727204163.66616: Set connection var ansible_timeout to 10 19665 1727204163.66622: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204163.66630: Set connection var ansible_shell_executable to /bin/sh 19665 1727204163.66638: Set connection var ansible_pipelining to False 19665 1727204163.66667: variable 'ansible_shell_executable' from source: unknown 19665 1727204163.66671: variable 'ansible_connection' from source: unknown 19665 1727204163.66674: variable 'ansible_module_compression' from source: unknown 19665 1727204163.66676: variable 'ansible_shell_type' from source: unknown 19665 1727204163.66678: variable 'ansible_shell_executable' from source: unknown 19665 1727204163.66681: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.66683: variable 'ansible_pipelining' from source: unknown 19665 1727204163.66685: variable 'ansible_timeout' from source: unknown 19665 1727204163.66689: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.67545: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204163.67556: variable 'omit' from source: magic vars 19665 1727204163.67562: starting attempt loop 19665 1727204163.67568: running the handler 19665 1727204163.67704: variable '__network_connections_result' from source: set_fact 19665 1727204163.67762: handler run complete 19665 1727204163.67780: attempt loop complete, returning result 19665 1727204163.67783: _execute() done 19665 1727204163.67786: dumping result to json 19665 1727204163.67788: done dumping result, returning 19665 1727204163.67798: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0dcc-3ea6-000000000026] 19665 1727204163.67803: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000026 19665 1727204163.67897: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000026 19665 1727204163.67900: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active)" ] } 19665 1727204163.67968: no more pending results, returning what we have 19665 1727204163.67971: results queue empty 19665 1727204163.67972: checking for any_errors_fatal 19665 1727204163.67981: done checking for any_errors_fatal 19665 1727204163.67982: checking for max_fail_percentage 19665 1727204163.67983: done checking for max_fail_percentage 19665 1727204163.67984: checking to see if all hosts have failed and the running result is not ok 19665 1727204163.67985: done checking to see if all hosts have failed 19665 
1727204163.67985: getting the remaining hosts for this loop 19665 1727204163.67987: done getting the remaining hosts for this loop 19665 1727204163.67992: getting the next task for host managed-node3 19665 1727204163.67998: done getting next task for host managed-node3 19665 1727204163.68002: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204163.68004: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204163.68015: getting variables 19665 1727204163.68017: in VariableManager get_vars() 19665 1727204163.68055: Calling all_inventory to load vars for managed-node3 19665 1727204163.68058: Calling groups_inventory to load vars for managed-node3 19665 1727204163.68060: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204163.68071: Calling all_plugins_play to load vars for managed-node3 19665 1727204163.68073: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204163.68076: Calling groups_plugins_play to load vars for managed-node3 19665 1727204163.70610: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204163.74692: done with get_vars() 19665 1727204163.74722: done getting variables 19665 1727204163.74907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.119) 0:00:14.616 ***** 19665 1727204163.74939: entering _queue_task() for managed-node3/debug 19665 1727204163.75781: worker is 1 (out of 1 available) 19665 1727204163.75795: exiting _queue_task() for managed-node3/debug 19665 1727204163.75807: done queuing things up, now waiting for results queue to drain 19665 1727204163.75809: waiting for pending results... 
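The stderr lines printed in the result above come from a plain variable-dump task; the trace records only the task name and path (roles/network/tasks/main.yml:177) plus its output, not the task body. A minimal sketch of such a task, assuming the usual debug-var form rather than the role's exact source:

    # Hedged reconstruction - the role source itself is not shown in this trace.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

The output key in the "ok:" block above (__network_connections_result.stderr_lines) is exactly what a debug task with var= echoes back, which is what makes this reconstruction plausible.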
19665 1727204163.76745: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204163.76835: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000027 19665 1727204163.76851: variable 'ansible_search_path' from source: unknown 19665 1727204163.76855: variable 'ansible_search_path' from source: unknown 19665 1727204163.76891: calling self._execute() 19665 1727204163.76979: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.76982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.76991: variable 'omit' from source: magic vars 19665 1727204163.78082: variable 'ansible_distribution_major_version' from source: facts 19665 1727204163.78093: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204163.78099: variable 'omit' from source: magic vars 19665 1727204163.78140: variable 'omit' from source: magic vars 19665 1727204163.78180: variable 'omit' from source: magic vars 19665 1727204163.78220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204163.78258: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204163.78285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204163.78297: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204163.78309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204163.78339: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204163.78346: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.78349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.78448: Set connection var ansible_connection to ssh 19665 1727204163.78456: Set connection var ansible_shell_type to sh 19665 1727204163.78462: Set connection var ansible_timeout to 10 19665 1727204163.78470: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204163.78478: Set connection var ansible_shell_executable to /bin/sh 19665 1727204163.78487: Set connection var ansible_pipelining to False 19665 1727204163.78510: variable 'ansible_shell_executable' from source: unknown 19665 1727204163.78513: variable 'ansible_connection' from source: unknown 19665 1727204163.78516: variable 'ansible_module_compression' from source: unknown 19665 1727204163.78518: variable 'ansible_shell_type' from source: unknown 19665 1727204163.78521: variable 'ansible_shell_executable' from source: unknown 19665 1727204163.78523: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.78525: variable 'ansible_pipelining' from source: unknown 19665 1727204163.78527: variable 'ansible_timeout' from source: unknown 19665 1727204163.78532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.78978: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 
1727204163.78990: variable 'omit' from source: magic vars 19665 1727204163.78995: starting attempt loop 19665 1727204163.78998: running the handler 19665 1727204163.79046: variable '__network_connections_result' from source: set_fact 19665 1727204163.79129: variable '__network_connections_result' from source: set_fact 19665 1727204163.79254: handler run complete 19665 1727204163.79280: attempt loop complete, returning result 19665 1727204163.79284: _execute() done 19665 1727204163.79287: dumping result to json 19665 1727204163.79289: done dumping result, returning 19665 1727204163.79298: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0dcc-3ea6-000000000027] 19665 1727204163.79304: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000027 19665 1727204163.79405: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000027 19665 1727204163.79408: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "interface_name": "LSR-TST-br31", "ip": { "auto6": true, "dhcp4": false }, "name": "LSR-TST-br31", "state": "up", "type": "bridge" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595\n[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'LSR-TST-br31': add connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595", "[004] #0, state:up persistent_state:present, 'LSR-TST-br31': up connection LSR-TST-br31, fe68f071-1086-45ef-92de-86b998c54595 (not-active)" ] } } 19665 1727204163.79493: no more pending results, returning what we have 19665 1727204163.79496: results queue empty 19665 1727204163.79497: checking for any_errors_fatal 19665 1727204163.79506: done checking for any_errors_fatal 19665 1727204163.79506: checking for max_fail_percentage 19665 1727204163.79508: done checking for max_fail_percentage 19665 1727204163.79509: checking to see if all hosts have failed and the running result is not ok 19665 1727204163.79510: done checking to see if all hosts have failed 19665 1727204163.79511: getting the remaining hosts for this loop 19665 1727204163.79513: done getting the remaining hosts for this loop 19665 1727204163.79518: getting the next task for host managed-node3 19665 1727204163.79523: done getting next task for host managed-node3 19665 1727204163.79527: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204163.79529: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204163.79538: getting variables 19665 1727204163.79540: in VariableManager get_vars() 19665 1727204163.79576: Calling all_inventory to load vars for managed-node3 19665 1727204163.79579: Calling groups_inventory to load vars for managed-node3 19665 1727204163.79581: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204163.79591: Calling all_plugins_play to load vars for managed-node3 19665 1727204163.79594: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204163.79596: Calling groups_plugins_play to load vars for managed-node3 19665 1727204163.82177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204163.85881: done with get_vars() 19665 1727204163.85916: done getting variables 19665 1727204163.86108: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.111) 0:00:14.728 ***** 19665 1727204163.86141: entering _queue_task() for managed-node3/debug 19665 1727204163.86918: worker is 1 (out of 1 available) 19665 1727204163.86971: exiting _queue_task() for managed-node3/debug 19665 1727204163.86985: done queuing things up, now waiting for results queue to drain 19665 1727204163.86986: waiting for pending results... 19665 1727204163.88045: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204163.88141: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000028 19665 1727204163.88151: variable 'ansible_search_path' from source: unknown 19665 1727204163.88155: variable 'ansible_search_path' from source: unknown 19665 1727204163.88192: calling self._execute() 19665 1727204163.88281: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.88285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.88295: variable 'omit' from source: magic vars 19665 1727204163.89378: variable 'ansible_distribution_major_version' from source: facts 19665 1727204163.89391: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204163.89518: variable 'network_state' from source: role '' defaults 19665 1727204163.89528: Evaluated conditional (network_state != {}): False 19665 1727204163.89533: when evaluation is False, skipping this task 19665 1727204163.89536: _execute() done 19665 1727204163.89539: dumping result to json 19665 1727204163.89543: done dumping result, returning 19665 1727204163.89555: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0dcc-3ea6-000000000028] 19665 1727204163.89558: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000028 19665 1727204163.89649: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000028 19665 1727204163.89654: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 19665 1727204163.89705: no more pending results, returning what we 
have 19665 1727204163.89709: results queue empty 19665 1727204163.89710: checking for any_errors_fatal 19665 1727204163.89720: done checking for any_errors_fatal 19665 1727204163.89721: checking for max_fail_percentage 19665 1727204163.89723: done checking for max_fail_percentage 19665 1727204163.89724: checking to see if all hosts have failed and the running result is not ok 19665 1727204163.89724: done checking to see if all hosts have failed 19665 1727204163.89725: getting the remaining hosts for this loop 19665 1727204163.89727: done getting the remaining hosts for this loop 19665 1727204163.89731: getting the next task for host managed-node3 19665 1727204163.89737: done getting next task for host managed-node3 19665 1727204163.89742: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204163.89744: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204163.89757: getting variables 19665 1727204163.89759: in VariableManager get_vars() 19665 1727204163.89797: Calling all_inventory to load vars for managed-node3 19665 1727204163.89800: Calling groups_inventory to load vars for managed-node3 19665 1727204163.89802: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204163.89811: Calling all_plugins_play to load vars for managed-node3 19665 1727204163.89813: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204163.89816: Calling groups_plugins_play to load vars for managed-node3 19665 1727204163.92722: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204163.96449: done with get_vars() 19665 1727204163.96490: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:03 -0400 (0:00:00.105) 0:00:14.833 ***** 19665 1727204163.96712: entering _queue_task() for managed-node3/ping 19665 1727204163.96714: Creating lock for ping 19665 1727204163.97536: worker is 1 (out of 1 available) 19665 1727204163.97551: exiting _queue_task() for managed-node3/ping 19665 1727204163.97568: done queuing things up, now waiting for results queue to drain 19665 1727204163.97570: waiting for pending results... 
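The module_args echoed in the "Show debug messages" result a few lines above expose the connection profile this run is applying: a single bridge, LSR-TST-br31, with DHCPv4 off and IPv6 autoconf on, applied through the nm provider. A hedged reconstruction of a playbook invocation that would produce those arguments; only the connection entry and the provider are taken from the trace, while the surrounding play and role wiring are assumptions:

    # Hedged sketch: hosts/role wrapper assumed; the connection entry mirrors
    # the module_args shown in the result above.
    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm        # assumed mapping for "provider": "nm"
            network_connections:
              - name: LSR-TST-br31
                interface_name: LSR-TST-br31
                type: bridge
                state: up
                ip:
                  dhcp4: false
                  auto6: true

The "Show debug messages for the network_state" task just above is skipped because network_state is still the role default of {}, so only the network_connections path is exercised in this run.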
19665 1727204163.98343: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204163.98666: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000029 19665 1727204163.98671: variable 'ansible_search_path' from source: unknown 19665 1727204163.98674: variable 'ansible_search_path' from source: unknown 19665 1727204163.98732: calling self._execute() 19665 1727204163.98906: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204163.98910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204163.98923: variable 'omit' from source: magic vars 19665 1727204163.99811: variable 'ansible_distribution_major_version' from source: facts 19665 1727204164.00048: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204164.00053: variable 'omit' from source: magic vars 19665 1727204164.00096: variable 'omit' from source: magic vars 19665 1727204164.00132: variable 'omit' from source: magic vars 19665 1727204164.00401: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204164.00440: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204164.00460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204164.00592: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204164.00603: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204164.00636: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204164.00642: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204164.00645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204164.00857: Set connection var ansible_connection to ssh 19665 1727204164.00865: Set connection var ansible_shell_type to sh 19665 1727204164.00905: Set connection var ansible_timeout to 10 19665 1727204164.00914: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204164.00920: Set connection var ansible_shell_executable to /bin/sh 19665 1727204164.00928: Set connection var ansible_pipelining to False 19665 1727204164.00954: variable 'ansible_shell_executable' from source: unknown 19665 1727204164.00958: variable 'ansible_connection' from source: unknown 19665 1727204164.00961: variable 'ansible_module_compression' from source: unknown 19665 1727204164.00963: variable 'ansible_shell_type' from source: unknown 19665 1727204164.00968: variable 'ansible_shell_executable' from source: unknown 19665 1727204164.00970: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204164.00972: variable 'ansible_pipelining' from source: unknown 19665 1727204164.00974: variable 'ansible_timeout' from source: unknown 19665 1727204164.00980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204164.01522: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204164.01530: variable 'omit' from source: magic vars 19665 
1727204164.01540: starting attempt loop 19665 1727204164.01543: running the handler 19665 1727204164.01671: _low_level_execute_command(): starting 19665 1727204164.01680: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204164.03691: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204164.03705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.03715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.03731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.03777: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.03785: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204164.03797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.03813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204164.03991: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204164.03995: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204164.03997: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.03999: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.04001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.04003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.04006: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204164.04008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.04574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.04578: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.04580: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.04583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.05848: stdout chunk (state=3): >>>/root <<< 19665 1727204164.05983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204164.06041: stderr chunk (state=3): >>><<< 19665 1727204164.06045: stdout chunk (state=3): >>><<< 19665 1727204164.06070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204164.06085: _low_level_execute_command(): starting 19665 1727204164.06093: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409 `" && echo ansible-tmp-1727204164.0606976-20839-121757829745409="` echo /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409 `" ) && sleep 0' 19665 1727204164.08304: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204164.08535: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.08546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.08561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.08607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.08736: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204164.08748: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.08762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204164.08771: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204164.08778: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204164.08786: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.08795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.08806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.08812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.08820: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204164.08829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.09002: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.09183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.09193: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.09274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.11115: stdout chunk (state=3): >>>ansible-tmp-1727204164.0606976-20839-121757829745409=/root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409 <<< 19665 1727204164.11309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204164.11312: stdout chunk (state=3): >>><<< 19665 1727204164.11320: stderr chunk (state=3): >>><<< 19665 1727204164.11359: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204164.0606976-20839-121757829745409=/root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204164.11408: variable 'ansible_module_compression' from source: unknown 19665 1727204164.11454: ANSIBALLZ: Using lock for ping 19665 1727204164.11459: ANSIBALLZ: Acquiring lock 19665 1727204164.11462: ANSIBALLZ: Lock acquired: 140619595638320 19665 1727204164.11464: ANSIBALLZ: Creating module 19665 1727204164.33564: ANSIBALLZ: Writing module into payload 19665 1727204164.33633: ANSIBALLZ: Writing module 19665 1727204164.33656: ANSIBALLZ: Renaming module 19665 1727204164.33662: ANSIBALLZ: Done creating module 19665 1727204164.33682: variable 'ansible_facts' from source: unknown 19665 1727204164.33874: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/AnsiballZ_ping.py 19665 1727204164.34004: Sending initial data 19665 1727204164.34007: Sent initial data (153 bytes) 19665 1727204164.35708: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204164.35716: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.35753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.35767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.35834: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.35845: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204164.35854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.35868: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204164.35874: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204164.35881: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204164.35889: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.35898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.35909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 
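The file copy in progress here (AnsiballZ_ping.py pushed into the freshly created remote tmp directory over sftp) happens because ansible_pipelining was set to False for this connection, as the connection-var lines earlier in this task show. A hedged sketch, not taken from this run, of how pipelining could be switched on so the module is piped to the remote interpreter instead of being written to the remote tmp dir first; the variable placement is illustrative only:

    # Hedged sketch - illustrative only; enabling pipelining skips the
    # per-task AnsiballZ file transfer seen in this trace.
    - hosts: managed-node3
      vars:
        ansible_pipelining: true
      tasks:
        - name: Quick connectivity check (hypothetical)
          ping: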
19665 1727204164.35916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.35923: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204164.35932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.36016: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.36034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.36048: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.36121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.37937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204164.37980: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204164.38044: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp7_x27s2p /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/AnsiballZ_ping.py <<< 19665 1727204164.38077: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204164.39355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204164.39566: stderr chunk (state=3): >>><<< 19665 1727204164.39580: stdout chunk (state=3): >>><<< 19665 1727204164.39615: done transferring module to remote 19665 1727204164.39620: _low_level_execute_command(): starting 19665 1727204164.39638: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/ /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/AnsiballZ_ping.py && sleep 0' 19665 1727204164.42299: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204164.42307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.42319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.42332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.42376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.42382: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204164.42392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.42405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204164.42413: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204164.42420: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 19665 1727204164.42428: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.42436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.42453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.42489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.42495: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204164.42505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.42593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.42597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.42603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.42683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.44570: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204164.45893: stderr chunk (state=3): >>><<< 19665 1727204164.45899: stdout chunk (state=3): >>><<< 19665 1727204164.45921: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204164.45924: _low_level_execute_command(): starting 19665 1727204164.45929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/AnsiballZ_ping.py && sleep 0' 19665 1727204164.49096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.49100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.49144: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.49150: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.49171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204164.49178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.49315: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.49366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.49484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.49641: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.62575: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19665 1727204164.63623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204164.63628: stdout chunk (state=3): >>><<< 19665 1727204164.63630: stderr chunk (state=3): >>><<< 19665 1727204164.63768: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
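The pong in the stdout chunk above is the entire payload of this step. The task body at roles/network/tasks/main.yml:192 is not shown in the trace, but the queueing line ("entering _queue_task() for managed-node3/ping") names the module, so a re-test of this shape is presumably just:

    # Hedged sketch - assumed task body; only the task name, the module name
    # and the "pong" result are visible in the trace.
    - name: Re-test connectivity
      ping: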
19665 1727204164.63772: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204164.63778: _low_level_execute_command(): starting 19665 1727204164.63780: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204164.0606976-20839-121757829745409/ > /dev/null 2>&1 && sleep 0' 19665 1727204164.64430: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204164.64483: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.64625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.64730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.64808: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.64822: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204164.64837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.64859: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204164.64882: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204164.64897: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204164.64910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204164.64925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204164.64941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204164.64954: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204164.64970: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204164.65030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204164.67627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204164.67773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204164.67789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204164.67947: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204164.69805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204164.69810: stdout chunk (state=3): >>><<< 19665 1727204164.69813: stderr chunk (state=3): >>><<< 19665 1727204164.70072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204164.70075: handler run complete 19665 1727204164.70078: attempt loop complete, returning result 19665 1727204164.70080: _execute() done 19665 1727204164.70082: dumping result to json 19665 1727204164.70084: done dumping result, returning 19665 1727204164.70086: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0dcc-3ea6-000000000029] 19665 1727204164.70088: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000029 19665 1727204164.70159: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000029 19665 1727204164.70163: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 19665 1727204164.70221: no more pending results, returning what we have 19665 1727204164.70225: results queue empty 19665 1727204164.70226: checking for any_errors_fatal 19665 1727204164.70233: done checking for any_errors_fatal 19665 1727204164.70234: checking for max_fail_percentage 19665 1727204164.70237: done checking for max_fail_percentage 19665 1727204164.70238: checking to see if all hosts have failed and the running result is not ok 19665 1727204164.70239: done checking to see if all hosts have failed 19665 1727204164.70239: getting the remaining hosts for this loop 19665 1727204164.70241: done getting the remaining hosts for this loop 19665 1727204164.70245: getting the next task for host managed-node3 19665 1727204164.70253: done getting next task for host managed-node3 19665 1727204164.70255: ^ task is: TASK: meta (role_complete) 19665 1727204164.70257: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204164.70277: getting variables 19665 1727204164.70279: in VariableManager get_vars() 19665 1727204164.70312: Calling all_inventory to load vars for managed-node3 19665 1727204164.70315: Calling groups_inventory to load vars for managed-node3 19665 1727204164.70317: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204164.70327: Calling all_plugins_play to load vars for managed-node3 19665 1727204164.70329: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204164.70332: Calling groups_plugins_play to load vars for managed-node3 19665 1727204164.73251: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204164.79900: done with get_vars() 19665 1727204164.79938: done getting variables 19665 1727204164.80267: done queuing things up, now waiting for results queue to drain 19665 1727204164.80269: results queue empty 19665 1727204164.80270: checking for any_errors_fatal 19665 1727204164.80274: done checking for any_errors_fatal 19665 1727204164.80275: checking for max_fail_percentage 19665 1727204164.80276: done checking for max_fail_percentage 19665 1727204164.80277: checking to see if all hosts have failed and the running result is not ok 19665 1727204164.80278: done checking to see if all hosts have failed 19665 1727204164.80279: getting the remaining hosts for this loop 19665 1727204164.80280: done getting the remaining hosts for this loop 19665 1727204164.80283: getting the next task for host managed-node3 19665 1727204164.80287: done getting next task for host managed-node3 19665 1727204164.80288: ^ task is: TASK: meta (flush_handlers) 19665 1727204164.80290: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204164.80293: getting variables 19665 1727204164.80295: in VariableManager get_vars() 19665 1727204164.80475: Calling all_inventory to load vars for managed-node3 19665 1727204164.80479: Calling groups_inventory to load vars for managed-node3 19665 1727204164.80481: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204164.80491: Calling all_plugins_play to load vars for managed-node3 19665 1727204164.80494: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204164.80497: Calling groups_plugins_play to load vars for managed-node3 19665 1727204164.83947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204164.88425: done with get_vars() 19665 1727204164.89162: done getting variables 19665 1727204164.90357: in VariableManager get_vars() 19665 1727204164.90485: Calling all_inventory to load vars for managed-node3 19665 1727204164.90488: Calling groups_inventory to load vars for managed-node3 19665 1727204164.90831: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204164.90839: Calling all_plugins_play to load vars for managed-node3 19665 1727204164.90841: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204164.90844: Calling groups_plugins_play to load vars for managed-node3 19665 1727204164.92761: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204164.96557: done with get_vars() 19665 1727204164.96594: done queuing things up, now waiting for results queue to drain 19665 1727204164.96596: results queue empty 19665 1727204164.96597: checking for any_errors_fatal 19665 1727204164.96598: done checking for any_errors_fatal 19665 1727204164.96599: checking for max_fail_percentage 19665 1727204164.96600: done checking for max_fail_percentage 19665 1727204164.96601: checking to see if all hosts have failed and the running result is not ok 19665 1727204164.96601: done checking to see if all hosts have failed 19665 1727204164.96602: getting the remaining hosts for this loop 19665 1727204164.96603: done getting the remaining hosts for this loop 19665 1727204164.96606: getting the next task for host managed-node3 19665 1727204164.96610: done getting next task for host managed-node3 19665 1727204164.96611: ^ task is: TASK: meta (flush_handlers) 19665 1727204164.96613: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204164.96615: getting variables 19665 1727204164.96616: in VariableManager get_vars() 19665 1727204164.96659: Calling all_inventory to load vars for managed-node3 19665 1727204164.96662: Calling groups_inventory to load vars for managed-node3 19665 1727204164.96666: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204164.96672: Calling all_plugins_play to load vars for managed-node3 19665 1727204164.96674: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204164.96677: Calling groups_plugins_play to load vars for managed-node3 19665 1727204164.99219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204165.02615: done with get_vars() 19665 1727204165.02648: done getting variables 19665 1727204165.02709: in VariableManager get_vars() 19665 1727204165.02724: Calling all_inventory to load vars for managed-node3 19665 1727204165.02726: Calling groups_inventory to load vars for managed-node3 19665 1727204165.02728: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204165.02733: Calling all_plugins_play to load vars for managed-node3 19665 1727204165.02740: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204165.02743: Calling groups_plugins_play to load vars for managed-node3 19665 1727204165.07069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204165.09895: done with get_vars() 19665 1727204165.09933: done queuing things up, now waiting for results queue to drain 19665 1727204165.09935: results queue empty 19665 1727204165.09936: checking for any_errors_fatal 19665 1727204165.09937: done checking for any_errors_fatal 19665 1727204165.09938: checking for max_fail_percentage 19665 1727204165.09939: done checking for max_fail_percentage 19665 1727204165.09939: checking to see if all hosts have failed and the running result is not ok 19665 1727204165.09940: done checking to see if all hosts have failed 19665 1727204165.09941: getting the remaining hosts for this loop 19665 1727204165.09942: done getting the remaining hosts for this loop 19665 1727204165.09945: getting the next task for host managed-node3 19665 1727204165.09949: done getting next task for host managed-node3 19665 1727204165.09949: ^ task is: None 19665 1727204165.09951: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204165.09952: done queuing things up, now waiting for results queue to drain 19665 1727204165.09953: results queue empty 19665 1727204165.09954: checking for any_errors_fatal 19665 1727204165.09954: done checking for any_errors_fatal 19665 1727204165.09955: checking for max_fail_percentage 19665 1727204165.09956: done checking for max_fail_percentage 19665 1727204165.09956: checking to see if all hosts have failed and the running result is not ok 19665 1727204165.09957: done checking to see if all hosts have failed 19665 1727204165.09958: getting the next task for host managed-node3 19665 1727204165.09960: done getting next task for host managed-node3 19665 1727204165.09961: ^ task is: None 19665 1727204165.09962: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204165.10071: in VariableManager get_vars() 19665 1727204165.10090: done with get_vars() 19665 1727204165.10097: in VariableManager get_vars() 19665 1727204165.10106: done with get_vars() 19665 1727204165.10110: variable 'omit' from source: magic vars 19665 1727204165.10229: variable 'task' from source: play vars 19665 1727204165.10261: in VariableManager get_vars() 19665 1727204165.10275: done with get_vars() 19665 1727204165.10295: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_present.yml] ************************ 19665 1727204165.10788: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204165.10813: getting the remaining hosts for this loop 19665 1727204165.10814: done getting the remaining hosts for this loop 19665 1727204165.10818: getting the next task for host managed-node3 19665 1727204165.10821: done getting next task for host managed-node3 19665 1727204165.10823: ^ task is: TASK: Gathering Facts 19665 1727204165.10824: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204165.10826: getting variables 19665 1727204165.10828: in VariableManager get_vars() 19665 1727204165.10837: Calling all_inventory to load vars for managed-node3 19665 1727204165.10840: Calling groups_inventory to load vars for managed-node3 19665 1727204165.10842: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204165.10848: Calling all_plugins_play to load vars for managed-node3 19665 1727204165.10850: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204165.10853: Calling groups_plugins_play to load vars for managed-node3 19665 1727204165.14680: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204165.16649: done with get_vars() 19665 1727204165.16680: done getting variables 19665 1727204165.16724: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:05 -0400 (0:00:01.200) 0:00:16.034 ***** 19665 1727204165.16749: entering _queue_task() for managed-node3/gather_facts 19665 1727204165.18076: worker is 1 (out of 1 available) 19665 1727204165.18089: exiting _queue_task() for managed-node3/gather_facts 19665 1727204165.18101: done queuing things up, now waiting for results queue to drain 19665 1727204165.18103: waiting for pending results... 
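A new play has just started ("Run the tasklist tasks/assert_device_present.yml", driven by run_tasks.yml with a 'task' play var), beginning with fact gathering. The wrapper playbook itself is not included in the trace; a hedged sketch of how such a tasklist runner is commonly structured, with the include mechanism being an assumption:

    # Hedged sketch of the implied wrapper play; only the play name, the
    # 'task' play var and the run_tasks.yml path are visible in the trace.
    - name: Run the tasklist {{ task }}
      hosts: all
      tasks:
        - name: Include the tasklist
          include_tasks: "{{ task }}"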
19665 1727204165.18920: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204165.19039: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000219 19665 1727204165.19128: variable 'ansible_search_path' from source: unknown 19665 1727204165.19169: calling self._execute() 19665 1727204165.19267: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204165.19280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204165.19295: variable 'omit' from source: magic vars 19665 1727204165.19906: variable 'ansible_distribution_major_version' from source: facts 19665 1727204165.20003: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204165.20013: variable 'omit' from source: magic vars 19665 1727204165.20045: variable 'omit' from source: magic vars 19665 1727204165.20088: variable 'omit' from source: magic vars 19665 1727204165.20215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204165.20256: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204165.20337: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204165.20360: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204165.20437: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204165.20537: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204165.20546: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204165.20553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204165.20770: Set connection var ansible_connection to ssh 19665 1727204165.20783: Set connection var ansible_shell_type to sh 19665 1727204165.20793: Set connection var ansible_timeout to 10 19665 1727204165.20802: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204165.20813: Set connection var ansible_shell_executable to /bin/sh 19665 1727204165.20826: Set connection var ansible_pipelining to False 19665 1727204165.20913: variable 'ansible_shell_executable' from source: unknown 19665 1727204165.20921: variable 'ansible_connection' from source: unknown 19665 1727204165.20928: variable 'ansible_module_compression' from source: unknown 19665 1727204165.20934: variable 'ansible_shell_type' from source: unknown 19665 1727204165.20940: variable 'ansible_shell_executable' from source: unknown 19665 1727204165.20946: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204165.20953: variable 'ansible_pipelining' from source: unknown 19665 1727204165.20979: variable 'ansible_timeout' from source: unknown 19665 1727204165.20989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204165.21438: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204165.21454: variable 'omit' from source: magic vars 19665 1727204165.21463: starting attempt loop 19665 1727204165.21473: running the 
handler 19665 1727204165.21491: variable 'ansible_facts' from source: unknown 19665 1727204165.21518: _low_level_execute_command(): starting 19665 1727204165.21581: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204165.23056: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.23523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.23541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.23560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.23608: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.23621: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.23635: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.23654: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.23669: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.23680: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204165.23693: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.23709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.23726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.23742: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.23758: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204165.23779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.23856: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204165.23922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204165.23940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204165.24091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204165.25677: stdout chunk (state=3): >>>/root <<< 19665 1727204165.25871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204165.25875: stdout chunk (state=3): >>><<< 19665 1727204165.25877: stderr chunk (state=3): >>><<< 19665 1727204165.25998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204165.26002: _low_level_execute_command(): starting 19665 1727204165.26005: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192 `" && echo ansible-tmp-1727204165.2589989-20934-277095670414192="` echo /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192 `" ) && sleep 0' 19665 1727204165.26927: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.26948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.26966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.26990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.27352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.27367: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.27382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.27399: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.27412: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.27427: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204165.27441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.27456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.27474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.27486: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.27498: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204165.27511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.27595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204165.27619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204165.27636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204165.27714: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204165.29535: stdout chunk (state=3): >>>ansible-tmp-1727204165.2589989-20934-277095670414192=/root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192 <<< 19665 1727204165.29743: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204165.29746: stdout chunk (state=3): >>><<< 19665 1727204165.29748: stderr chunk (state=3): >>><<< 19665 1727204165.29977: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204165.2589989-20934-277095670414192=/root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204165.29980: variable 'ansible_module_compression' from source: unknown 19665 1727204165.29983: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204165.29985: variable 'ansible_facts' from source: unknown 19665 1727204165.30157: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/AnsiballZ_setup.py 19665 1727204165.30770: Sending initial data 19665 1727204165.30781: Sent initial data (154 bytes) 19665 1727204165.32277: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.32293: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.32308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.32324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.32371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.32383: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.32401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.32419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.32431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.32447: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204165.32461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.32482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.32498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.32511: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.32524: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204165.32542: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.32623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204165.32644: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204165.32659: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204165.32980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204165.34530: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204165.34562: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204165.34600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5aozui1s /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/AnsiballZ_setup.py <<< 19665 1727204165.34635: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204165.38206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204165.38560: stderr chunk (state=3): >>><<< 19665 1727204165.38571: stdout chunk (state=3): >>><<< 19665 1727204165.38574: done transferring module to remote 19665 1727204165.38586: _low_level_execute_command(): starting 19665 1727204165.38590: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/ /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/AnsiballZ_setup.py && sleep 0' 19665 1727204165.40379: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.40417: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.40442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.40487: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.40568: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.40643: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.40658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.40678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.40690: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.40700: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204165.40712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.40735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.40751: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.40762: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.40775: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204165.40788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.40877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204165.40894: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204165.40908: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204165.41022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204165.42744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204165.42851: stderr chunk (state=3): >>><<< 19665 1727204165.42898: stdout chunk (state=3): >>><<< 19665 1727204165.42918: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204165.42921: _low_level_execute_command(): starting 19665 1727204165.42924: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/AnsiballZ_setup.py && sleep 0' 19665 1727204165.43623: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.43635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.43662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.43677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.43731: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.43746: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.43773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.43797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.43817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.43832: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204165.43845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.43860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.43883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204165.43886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.43962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204165.43997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204165.44017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204165.44372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204165.95017: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare}<<< 19665 1727204165.95041: stdout chunk (state=3): >>> ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "05", "epoch": "1727204165", "epoch_int": "1727204165", "date": "2024-09-24", "time": "14:56:05", "iso8601_micro": "2024-09-24T18:56:05.681084Z", "iso8601": "2024-09-24T18:56:05Z", "iso8601_basic": "20240924T145605681084", "iso8601_basic_short": "20240924T145605", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.31, "5m": 0.33, "15m": 
0.16}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "<<< 19665 1727204165.95079: stdout chunk (state=3): >>>off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", 
"tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", 
"tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 511, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": 
"rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282099712, "block_size": 4096, "block_total": 65519355, "block_available": 64521997, "block_used": 997358, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204165.96668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204165.96754: stderr chunk (state=3): >>><<< 19665 1727204165.96758: stdout chunk (state=3): >>><<< 19665 1727204165.96977: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": 
"AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_iscsi_iqn": "", "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "05", "epoch": "1727204165", "epoch_int": "1727204165", "date": "2024-09-24", "time": "14:56:05", "iso8601_micro": "2024-09-24T18:56:05.681084Z", "iso8601": "2024-09-24T18:56:05Z", "iso8601_basic": "20240924T145605681084", "iso8601_basic_short": "20240924T145605", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.31, "5m": 0.33, "15m": 0.16}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["LSR-TST-br31", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 
9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off 
[fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off 
[fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2815, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 717, "free": 2815}, "nocache": {"free": 3274, "used": 258}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 511, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282099712, "block_size": 4096, "block_total": 65519355, "block_available": 64521997, "block_used": 997358, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": 
"ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204165.97255: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204165.97286: _low_level_execute_command(): starting 19665 1727204165.97304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204165.2589989-20934-277095670414192/ > /dev/null 2>&1 && sleep 0' 19665 1727204165.99244: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204165.99377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.99394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.99414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.99460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.99640: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204165.99656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204165.99678: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204165.99696: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204165.99710: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 
1727204165.99725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204165.99741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204165.99758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204165.99773: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204165.99785: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204165.99804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.00048: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.00066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.00081: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.00208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.02147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204166.02151: stdout chunk (state=3): >>><<< 19665 1727204166.02153: stderr chunk (state=3): >>><<< 19665 1727204166.02374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204166.02378: handler run complete 19665 1727204166.02380: variable 'ansible_facts' from source: unknown 19665 1727204166.02458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.02968: variable 'ansible_facts' from source: unknown 19665 1727204166.03179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.03447: attempt loop complete, returning result 19665 1727204166.03547: _execute() done 19665 1727204166.03555: dumping result to json 19665 1727204166.03715: done dumping result, returning 19665 1727204166.03786: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-000000000219] 19665 1727204166.03804: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000219 ok: [managed-node3] 19665 1727204166.05315: no more pending results, returning what we have 19665 1727204166.05319: results queue empty 19665 1727204166.05320: checking for 
any_errors_fatal 19665 1727204166.05322: done checking for any_errors_fatal 19665 1727204166.05323: checking for max_fail_percentage 19665 1727204166.05324: done checking for max_fail_percentage 19665 1727204166.05325: checking to see if all hosts have failed and the running result is not ok 19665 1727204166.05326: done checking to see if all hosts have failed 19665 1727204166.05327: getting the remaining hosts for this loop 19665 1727204166.05328: done getting the remaining hosts for this loop 19665 1727204166.05333: getting the next task for host managed-node3 19665 1727204166.05343: done getting next task for host managed-node3 19665 1727204166.05345: ^ task is: TASK: meta (flush_handlers) 19665 1727204166.05347: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204166.05351: getting variables 19665 1727204166.05353: in VariableManager get_vars() 19665 1727204166.05379: Calling all_inventory to load vars for managed-node3 19665 1727204166.05381: Calling groups_inventory to load vars for managed-node3 19665 1727204166.05389: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.05401: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.05404: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.05407: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.06571: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000219 19665 1727204166.06575: WORKER PROCESS EXITING 19665 1727204166.08381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.11089: done with get_vars() 19665 1727204166.11118: done getting variables 19665 1727204166.11200: in VariableManager get_vars() 19665 1727204166.11212: Calling all_inventory to load vars for managed-node3 19665 1727204166.11215: Calling groups_inventory to load vars for managed-node3 19665 1727204166.11218: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.11223: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.11225: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.11233: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.12553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.26166: done with get_vars() 19665 1727204166.26909: done queuing things up, now waiting for results queue to drain 19665 1727204166.26912: results queue empty 19665 1727204166.26913: checking for any_errors_fatal 19665 1727204166.26917: done checking for any_errors_fatal 19665 1727204166.26918: checking for max_fail_percentage 19665 1727204166.26919: done checking for max_fail_percentage 19665 1727204166.26920: checking to see if all hosts have failed and the running result is not ok 19665 1727204166.26921: done checking to see if all hosts have failed 19665 1727204166.26921: getting the remaining hosts for this loop 19665 1727204166.26922: done getting the remaining hosts for this loop 19665 1727204166.26925: getting the next task for host managed-node3 19665 1727204166.26929: done getting next task for host managed-node3 19665 1727204166.26932: ^ task is: TASK: 
Include the task '{{ task }}' 19665 1727204166.26933: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204166.26936: getting variables 19665 1727204166.26937: in VariableManager get_vars() 19665 1727204166.27107: Calling all_inventory to load vars for managed-node3 19665 1727204166.27110: Calling groups_inventory to load vars for managed-node3 19665 1727204166.27113: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.27119: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.27122: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.27178: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.29593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.31598: done with get_vars() 19665 1727204166.31623: done getting variables 19665 1727204166.31802: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_present.yml'] ********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:06 -0400 (0:00:01.150) 0:00:17.184 ***** 19665 1727204166.31832: entering _queue_task() for managed-node3/include_tasks 19665 1727204166.32294: worker is 1 (out of 1 available) 19665 1727204166.32307: exiting _queue_task() for managed-node3/include_tasks 19665 1727204166.32320: done queuing things up, now waiting for results queue to drain 19665 1727204166.32327: waiting for pending results... 
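[Annotation: the task queued above comes from run_tasks.yml:6, which includes whatever file the `task` play variable points at (here tasks/assert_device_present.yml). A minimal sketch of that task, reconstructed from the templated task name shown in the log; the exact file contents are an assumption:

    # run_tasks.yml (sketch) -- `task` is supplied as a play variable
    - name: "Include the task '{{ task }}'"
      include_tasks: "{{ task }}"
]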
19665 1727204166.32656: running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_device_present.yml' 19665 1727204166.32790: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000002d 19665 1727204166.32960: variable 'ansible_search_path' from source: unknown 19665 1727204166.33487: calling self._execute() 19665 1727204166.33631: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204166.33645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204166.33670: variable 'omit' from source: magic vars 19665 1727204166.34577: variable 'ansible_distribution_major_version' from source: facts 19665 1727204166.34604: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204166.34619: variable 'task' from source: play vars 19665 1727204166.34732: variable 'task' from source: play vars 19665 1727204166.34750: _execute() done 19665 1727204166.34758: dumping result to json 19665 1727204166.34768: done dumping result, returning 19665 1727204166.34790: done running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_device_present.yml' [0affcd87-79f5-0dcc-3ea6-00000000002d] 19665 1727204166.34804: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000002d 19665 1727204166.34925: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000002d 19665 1727204166.34928: WORKER PROCESS EXITING 19665 1727204166.34974: no more pending results, returning what we have 19665 1727204166.34980: in VariableManager get_vars() 19665 1727204166.35035: Calling all_inventory to load vars for managed-node3 19665 1727204166.35041: Calling groups_inventory to load vars for managed-node3 19665 1727204166.35047: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.35063: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.35069: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.35073: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.37694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.40082: done with get_vars() 19665 1727204166.40108: variable 'ansible_search_path' from source: unknown 19665 1727204166.40125: we have included files to process 19665 1727204166.40126: generating all_blocks data 19665 1727204166.40127: done generating all_blocks data 19665 1727204166.40128: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19665 1727204166.40133: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19665 1727204166.40139: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 19665 1727204166.40319: in VariableManager get_vars() 19665 1727204166.40336: done with get_vars() 19665 1727204166.40456: done processing included file 19665 1727204166.40458: iterating over new_blocks loaded from include file 19665 1727204166.40459: in VariableManager get_vars() 19665 1727204166.40471: done with get_vars() 19665 1727204166.40472: filtering new block on tags 19665 1727204166.40488: done filtering new block on tags 19665 1727204166.40490: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node3 19665 1727204166.40495: extending task lists for all hosts with included blocks 19665 1727204166.40525: done extending task lists 19665 1727204166.40527: done processing included files 19665 1727204166.40527: results queue empty 19665 1727204166.40528: checking for any_errors_fatal 19665 1727204166.40530: done checking for any_errors_fatal 19665 1727204166.40531: checking for max_fail_percentage 19665 1727204166.40532: done checking for max_fail_percentage 19665 1727204166.40532: checking to see if all hosts have failed and the running result is not ok 19665 1727204166.40533: done checking to see if all hosts have failed 19665 1727204166.40534: getting the remaining hosts for this loop 19665 1727204166.40535: done getting the remaining hosts for this loop 19665 1727204166.40540: getting the next task for host managed-node3 19665 1727204166.40544: done getting next task for host managed-node3 19665 1727204166.40546: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19665 1727204166.40548: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204166.40550: getting variables 19665 1727204166.40551: in VariableManager get_vars() 19665 1727204166.40560: Calling all_inventory to load vars for managed-node3 19665 1727204166.40563: Calling groups_inventory to load vars for managed-node3 19665 1727204166.40566: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.40571: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.40573: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.40575: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.44420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.48054: done with get_vars() 19665 1727204166.48095: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:56:06 -0400 (0:00:00.165) 0:00:17.350 ***** 19665 1727204166.48401: entering _queue_task() for managed-node3/include_tasks 19665 1727204166.48653: worker is 1 (out of 1 available) 19665 1727204166.48668: exiting _queue_task() for managed-node3/include_tasks 19665 1727204166.48680: done queuing things up, now waiting for results queue to drain 19665 1727204166.48682: waiting for pending results... 
19665 1727204166.48892: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 19665 1727204166.49035: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000022a 19665 1727204166.49055: variable 'ansible_search_path' from source: unknown 19665 1727204166.49074: variable 'ansible_search_path' from source: unknown 19665 1727204166.49129: calling self._execute() 19665 1727204166.49272: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204166.49673: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204166.49682: variable 'omit' from source: magic vars 19665 1727204166.50322: variable 'ansible_distribution_major_version' from source: facts 19665 1727204166.50333: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204166.50345: _execute() done 19665 1727204166.50353: dumping result to json 19665 1727204166.50361: done dumping result, returning 19665 1727204166.50374: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0dcc-3ea6-00000000022a] 19665 1727204166.50385: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000022a 19665 1727204166.50530: no more pending results, returning what we have 19665 1727204166.50537: in VariableManager get_vars() 19665 1727204166.50577: Calling all_inventory to load vars for managed-node3 19665 1727204166.50579: Calling groups_inventory to load vars for managed-node3 19665 1727204166.50583: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.50596: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.50599: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.50602: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.52026: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000022a 19665 1727204166.52031: WORKER PROCESS EXITING 19665 1727204166.52821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.55753: done with get_vars() 19665 1727204166.55784: variable 'ansible_search_path' from source: unknown 19665 1727204166.55785: variable 'ansible_search_path' from source: unknown 19665 1727204166.55911: variable 'task' from source: play vars 19665 1727204166.56143: variable 'task' from source: play vars 19665 1727204166.56179: we have included files to process 19665 1727204166.56181: generating all_blocks data 19665 1727204166.56182: done generating all_blocks data 19665 1727204166.56183: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204166.56185: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204166.56186: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204166.56549: done processing included file 19665 1727204166.56666: iterating over new_blocks loaded from include file 19665 1727204166.56668: in VariableManager get_vars() 19665 1727204166.56686: done with get_vars() 19665 1727204166.56688: filtering new block on tags 19665 1727204166.56707: done filtering new block on tags 19665 1727204166.56709: done iterating over new_blocks loaded from include file included: 
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 19665 1727204166.56714: extending task lists for all hosts with included blocks 19665 1727204166.56945: done extending task lists 19665 1727204166.56947: done processing included files 19665 1727204166.56948: results queue empty 19665 1727204166.56948: checking for any_errors_fatal 19665 1727204166.56952: done checking for any_errors_fatal 19665 1727204166.56952: checking for max_fail_percentage 19665 1727204166.56954: done checking for max_fail_percentage 19665 1727204166.56955: checking to see if all hosts have failed and the running result is not ok 19665 1727204166.56955: done checking to see if all hosts have failed 19665 1727204166.56956: getting the remaining hosts for this loop 19665 1727204166.56957: done getting the remaining hosts for this loop 19665 1727204166.56960: getting the next task for host managed-node3 19665 1727204166.56965: done getting next task for host managed-node3 19665 1727204166.56968: ^ task is: TASK: Get stat for interface {{ interface }} 19665 1727204166.56971: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204166.56973: getting variables 19665 1727204166.56974: in VariableManager get_vars() 19665 1727204166.57098: Calling all_inventory to load vars for managed-node3 19665 1727204166.57101: Calling groups_inventory to load vars for managed-node3 19665 1727204166.57104: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204166.57111: Calling all_plugins_play to load vars for managed-node3 19665 1727204166.57113: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204166.57116: Calling groups_plugins_play to load vars for managed-node3 19665 1727204166.58594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204166.60813: done with get_vars() 19665 1727204166.60956: done getting variables 19665 1727204166.61214: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:06 -0400 (0:00:00.128) 0:00:17.479 ***** 19665 1727204166.61244: entering _queue_task() for managed-node3/stat 19665 1727204166.62699: worker is 1 (out of 1 available) 19665 1727204166.62710: exiting _queue_task() for managed-node3/stat 19665 1727204166.62722: done queuing things up, now waiting for results queue to drain 19665 1727204166.62723: waiting for pending results... 
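[Annotation: the stat invocation traced below corresponds to the task at get_interface_stat.yml:3. A minimal sketch, assuming the result is registered as `interface_stat` (the name the later assert references); module arguments match the invocation shown further down in the log:

    # get_interface_stat.yml (sketch)
    - name: Get stat for interface {{ interface }}
      stat:
        path: /sys/class/net/{{ interface }}   # symlink exists only if the device is present
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat
]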
19665 1727204166.63535: running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 19665 1727204166.63903: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000235 19665 1727204166.63993: variable 'ansible_search_path' from source: unknown 19665 1727204166.64001: variable 'ansible_search_path' from source: unknown 19665 1727204166.64044: calling self._execute() 19665 1727204166.64285: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204166.64296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204166.64319: variable 'omit' from source: magic vars 19665 1727204166.65050: variable 'ansible_distribution_major_version' from source: facts 19665 1727204166.65201: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204166.65216: variable 'omit' from source: magic vars 19665 1727204166.65295: variable 'omit' from source: magic vars 19665 1727204166.65542: variable 'interface' from source: set_fact 19665 1727204166.65567: variable 'omit' from source: magic vars 19665 1727204166.65730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204166.65778: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204166.65809: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204166.65851: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204166.65960: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204166.66001: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204166.66011: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204166.66020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204166.66138: Set connection var ansible_connection to ssh 19665 1727204166.66281: Set connection var ansible_shell_type to sh 19665 1727204166.66294: Set connection var ansible_timeout to 10 19665 1727204166.66303: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204166.66317: Set connection var ansible_shell_executable to /bin/sh 19665 1727204166.66328: Set connection var ansible_pipelining to False 19665 1727204166.66355: variable 'ansible_shell_executable' from source: unknown 19665 1727204166.66489: variable 'ansible_connection' from source: unknown 19665 1727204166.66501: variable 'ansible_module_compression' from source: unknown 19665 1727204166.66509: variable 'ansible_shell_type' from source: unknown 19665 1727204166.66516: variable 'ansible_shell_executable' from source: unknown 19665 1727204166.66523: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204166.66531: variable 'ansible_pipelining' from source: unknown 19665 1727204166.66538: variable 'ansible_timeout' from source: unknown 19665 1727204166.66546: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204166.66912: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204166.66981: variable 
'omit' from source: magic vars 19665 1727204166.66992: starting attempt loop 19665 1727204166.67036: running the handler 19665 1727204166.67058: _low_level_execute_command(): starting 19665 1727204166.67073: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204166.69048: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204166.69115: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.69133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.69154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.69206: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.69335: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204166.69351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.69374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204166.69387: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204166.69399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204166.69413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.69436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.69454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.69470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.69482: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204166.69496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.69670: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.69689: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.69705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.69883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.71473: stdout chunk (state=3): >>>/root <<< 19665 1727204166.71673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204166.71676: stdout chunk (state=3): >>><<< 19665 1727204166.71680: stderr chunk (state=3): >>><<< 19665 1727204166.71810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204166.71815: _low_level_execute_command(): starting 19665 1727204166.71819: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871 `" && echo ansible-tmp-1727204166.7170706-21126-236360075209871="` echo /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871 `" ) && sleep 0' 19665 1727204166.73281: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204166.73340: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.73357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.73380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.73423: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.73479: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204166.73492: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.73511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204166.73523: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204166.73537: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204166.73555: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.73573: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.73669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.73684: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.73694: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204166.73706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.73788: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.73888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.73903: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.74081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.75961: stdout chunk (state=3): >>>ansible-tmp-1727204166.7170706-21126-236360075209871=/root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871 <<< 19665 1727204166.76157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204166.76161: stdout chunk (state=3): >>><<< 19665 1727204166.76166: stderr chunk (state=3): >>><<< 19665 1727204166.76382: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204166.7170706-21126-236360075209871=/root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204166.76386: variable 'ansible_module_compression' from source: unknown 19665 1727204166.76388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19665 1727204166.76391: variable 'ansible_facts' from source: unknown 19665 1727204166.76429: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/AnsiballZ_stat.py 19665 1727204166.76789: Sending initial data 19665 1727204166.76792: Sent initial data (153 bytes) 19665 1727204166.78039: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.78043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.78094: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204166.78098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.78101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204166.78103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.78176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.78180: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.78182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.78241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.79954: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204166.79998: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204166.80041: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpi9iul2xg /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/AnsiballZ_stat.py <<< 19665 1727204166.80157: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204166.81385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204166.81532: stderr chunk (state=3): >>><<< 19665 1727204166.81536: stdout chunk (state=3): >>><<< 19665 1727204166.81557: done transferring module to remote 19665 1727204166.81570: _low_level_execute_command(): starting 19665 1727204166.81576: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/ /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/AnsiballZ_stat.py && sleep 0' 19665 1727204166.82216: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204166.82224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.82234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.82248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.82286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.82293: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204166.82309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.82321: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204166.82328: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204166.82335: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204166.82343: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.82351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.82361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.82371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.82377: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204166.82386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.82462: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.82482: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.82492: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.82558: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.84355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204166.84358: stdout chunk (state=3): >>><<< 19665 1727204166.84366: stderr chunk (state=3): >>><<< 19665 1727204166.84387: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204166.84391: _low_level_execute_command(): starting 19665 1727204166.84395: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/AnsiballZ_stat.py && sleep 0' 19665 1727204166.85487: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204166.85509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.85525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.85546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.85588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204166.85602: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204166.85622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.85642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204166.85654: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204166.85667: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204166.85685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204166.85699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204166.85720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204166.85733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 
1727204166.85747: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204166.85761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204166.85842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204166.85858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204166.85875: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204166.86008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204166.99045: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29666, "dev": 21, "nlink": 1, "atime": 1727204163.349051, "mtime": 1727204163.349051, "ctime": 1727204163.349051, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19665 1727204167.00054: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204167.00158: stderr chunk (state=3): >>><<< 19665 1727204167.00161: stdout chunk (state=3): >>><<< 19665 1727204167.00175: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/LSR-TST-br31", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 29666, "dev": 21, "nlink": 1, "atime": 1727204163.349051, "mtime": 1727204163.349051, "ctime": 1727204163.349051, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204167.00218: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204167.00226: _low_level_execute_command(): starting 19665 1727204167.00230: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204166.7170706-21126-236360075209871/ > /dev/null 2>&1 && sleep 0' 19665 1727204167.00708: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.00711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.00746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204167.00752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.00762: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204167.00769: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.00788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204167.00791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.00850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.00856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.00908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.02672: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204167.02749: stderr chunk (state=3): >>><<< 19665 1727204167.02752: stdout chunk (state=3): >>><<< 19665 
1727204167.02770: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204167.02780: handler run complete 19665 1727204167.02833: attempt loop complete, returning result 19665 1727204167.02836: _execute() done 19665 1727204167.02841: dumping result to json 19665 1727204167.02843: done dumping result, returning 19665 1727204167.02845: done running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000235] 19665 1727204167.02867: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000235 19665 1727204167.02994: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000235 19665 1727204167.02997: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204163.349051, "block_size": 4096, "blocks": 0, "ctime": 1727204163.349051, "dev": 21, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 29666, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/LSR-TST-br31", "lnk_target": "../../devices/virtual/net/LSR-TST-br31", "mode": "0777", "mtime": 1727204163.349051, "nlink": 1, "path": "/sys/class/net/LSR-TST-br31", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 19665 1727204167.03104: no more pending results, returning what we have 19665 1727204167.03125: results queue empty 19665 1727204167.03126: checking for any_errors_fatal 19665 1727204167.03128: done checking for any_errors_fatal 19665 1727204167.03129: checking for max_fail_percentage 19665 1727204167.03130: done checking for max_fail_percentage 19665 1727204167.03131: checking to see if all hosts have failed and the running result is not ok 19665 1727204167.03132: done checking to see if all hosts have failed 19665 1727204167.03133: getting the remaining hosts for this loop 19665 1727204167.03134: done getting the remaining hosts for this loop 19665 1727204167.03140: getting the next task for host managed-node3 19665 1727204167.03163: done getting next task for host managed-node3 19665 1727204167.03168: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 19665 
1727204167.03171: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204167.03184: getting variables 19665 1727204167.03186: in VariableManager get_vars() 19665 1727204167.03257: Calling all_inventory to load vars for managed-node3 19665 1727204167.03261: Calling groups_inventory to load vars for managed-node3 19665 1727204167.03279: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.03298: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.03301: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.03305: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.04493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.05537: done with get_vars() 19665 1727204167.05554: done getting variables 19665 1727204167.05599: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204167.05691: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'LSR-TST-br31'] ******************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:56:07 -0400 (0:00:00.444) 0:00:17.923 ***** 19665 1727204167.05715: entering _queue_task() for managed-node3/assert 19665 1727204167.05995: worker is 1 (out of 1 available) 19665 1727204167.06009: exiting _queue_task() for managed-node3/assert 19665 1727204167.06022: done queuing things up, now waiting for results queue to drain 19665 1727204167.06024: waiting for pending results... 
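[Annotation: the assert task queued here (assert_device_present.yml:5) checks the registered stat result; the evaluated conditional `interface_stat.stat.exists` appears below. A minimal sketch of that task; any failure message text would be an assumption and is omitted:

    # assert_device_present.yml:5 (sketch)
    - name: Assert that the interface is present - '{{ interface }}'
      assert:
        that:
          - interface_stat.stat.exists
]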
19665 1727204167.06285: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'LSR-TST-br31' 19665 1727204167.06368: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000022b 19665 1727204167.06383: variable 'ansible_search_path' from source: unknown 19665 1727204167.06386: variable 'ansible_search_path' from source: unknown 19665 1727204167.06421: calling self._execute() 19665 1727204167.06492: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.06496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.06504: variable 'omit' from source: magic vars 19665 1727204167.06786: variable 'ansible_distribution_major_version' from source: facts 19665 1727204167.06797: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204167.06802: variable 'omit' from source: magic vars 19665 1727204167.06839: variable 'omit' from source: magic vars 19665 1727204167.06910: variable 'interface' from source: set_fact 19665 1727204167.06923: variable 'omit' from source: magic vars 19665 1727204167.06957: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204167.06983: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204167.07000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204167.07015: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204167.07024: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204167.07049: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204167.07052: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.07055: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.07151: Set connection var ansible_connection to ssh 19665 1727204167.07154: Set connection var ansible_shell_type to sh 19665 1727204167.07156: Set connection var ansible_timeout to 10 19665 1727204167.07158: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204167.07161: Set connection var ansible_shell_executable to /bin/sh 19665 1727204167.07163: Set connection var ansible_pipelining to False 19665 1727204167.07200: variable 'ansible_shell_executable' from source: unknown 19665 1727204167.07204: variable 'ansible_connection' from source: unknown 19665 1727204167.07207: variable 'ansible_module_compression' from source: unknown 19665 1727204167.07209: variable 'ansible_shell_type' from source: unknown 19665 1727204167.07211: variable 'ansible_shell_executable' from source: unknown 19665 1727204167.07213: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.07215: variable 'ansible_pipelining' from source: unknown 19665 1727204167.07218: variable 'ansible_timeout' from source: unknown 19665 1727204167.07220: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.07355: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 19665 1727204167.07361: variable 'omit' from source: magic vars 19665 1727204167.07364: starting attempt loop 19665 1727204167.07366: running the handler 19665 1727204167.07478: variable 'interface_stat' from source: set_fact 19665 1727204167.07507: Evaluated conditional (interface_stat.stat.exists): True 19665 1727204167.07518: handler run complete 19665 1727204167.07538: attempt loop complete, returning result 19665 1727204167.07546: _execute() done 19665 1727204167.07556: dumping result to json 19665 1727204167.07576: done dumping result, returning 19665 1727204167.07580: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-00000000022b] 19665 1727204167.07582: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000022b 19665 1727204167.07659: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000022b 19665 1727204167.07661: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204167.07708: no more pending results, returning what we have 19665 1727204167.07712: results queue empty 19665 1727204167.07713: checking for any_errors_fatal 19665 1727204167.07720: done checking for any_errors_fatal 19665 1727204167.07721: checking for max_fail_percentage 19665 1727204167.07722: done checking for max_fail_percentage 19665 1727204167.07723: checking to see if all hosts have failed and the running result is not ok 19665 1727204167.07724: done checking to see if all hosts have failed 19665 1727204167.07725: getting the remaining hosts for this loop 19665 1727204167.07726: done getting the remaining hosts for this loop 19665 1727204167.07730: getting the next task for host managed-node3 19665 1727204167.07738: done getting next task for host managed-node3 19665 1727204167.07741: ^ task is: TASK: meta (flush_handlers) 19665 1727204167.07743: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204167.07746: getting variables 19665 1727204167.07747: in VariableManager get_vars() 19665 1727204167.07780: Calling all_inventory to load vars for managed-node3 19665 1727204167.07782: Calling groups_inventory to load vars for managed-node3 19665 1727204167.07786: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.07797: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.07801: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.07803: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.09092: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.10224: done with get_vars() 19665 1727204167.10251: done getting variables 19665 1727204167.10308: in VariableManager get_vars() 19665 1727204167.10315: Calling all_inventory to load vars for managed-node3 19665 1727204167.10317: Calling groups_inventory to load vars for managed-node3 19665 1727204167.10318: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.10322: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.10323: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.10325: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.11369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.13056: done with get_vars() 19665 1727204167.13090: done queuing things up, now waiting for results queue to drain 19665 1727204167.13093: results queue empty 19665 1727204167.13094: checking for any_errors_fatal 19665 1727204167.13097: done checking for any_errors_fatal 19665 1727204167.13098: checking for max_fail_percentage 19665 1727204167.13099: done checking for max_fail_percentage 19665 1727204167.13099: checking to see if all hosts have failed and the running result is not ok 19665 1727204167.13100: done checking to see if all hosts have failed 19665 1727204167.13106: getting the remaining hosts for this loop 19665 1727204167.13107: done getting the remaining hosts for this loop 19665 1727204167.13110: getting the next task for host managed-node3 19665 1727204167.13115: done getting next task for host managed-node3 19665 1727204167.13116: ^ task is: TASK: meta (flush_handlers) 19665 1727204167.13118: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204167.13121: getting variables 19665 1727204167.13122: in VariableManager get_vars() 19665 1727204167.13132: Calling all_inventory to load vars for managed-node3 19665 1727204167.13134: Calling groups_inventory to load vars for managed-node3 19665 1727204167.13137: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.13143: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.13145: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.13148: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.14098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.17454: done with get_vars() 19665 1727204167.17600: done getting variables 19665 1727204167.17660: in VariableManager get_vars() 19665 1727204167.17674: Calling all_inventory to load vars for managed-node3 19665 1727204167.17677: Calling groups_inventory to load vars for managed-node3 19665 1727204167.17680: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.17685: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.17687: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.17690: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.19481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.23520: done with get_vars() 19665 1727204167.23549: done queuing things up, now waiting for results queue to drain 19665 1727204167.23552: results queue empty 19665 1727204167.23552: checking for any_errors_fatal 19665 1727204167.23554: done checking for any_errors_fatal 19665 1727204167.23555: checking for max_fail_percentage 19665 1727204167.23556: done checking for max_fail_percentage 19665 1727204167.23557: checking to see if all hosts have failed and the running result is not ok 19665 1727204167.23557: done checking to see if all hosts have failed 19665 1727204167.23558: getting the remaining hosts for this loop 19665 1727204167.23559: done getting the remaining hosts for this loop 19665 1727204167.23562: getting the next task for host managed-node3 19665 1727204167.23568: done getting next task for host managed-node3 19665 1727204167.23569: ^ task is: None 19665 1727204167.23571: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204167.23572: done queuing things up, now waiting for results queue to drain 19665 1727204167.23573: results queue empty 19665 1727204167.23573: checking for any_errors_fatal 19665 1727204167.23574: done checking for any_errors_fatal 19665 1727204167.23575: checking for max_fail_percentage 19665 1727204167.23576: done checking for max_fail_percentage 19665 1727204167.23576: checking to see if all hosts have failed and the running result is not ok 19665 1727204167.23577: done checking to see if all hosts have failed 19665 1727204167.23578: getting the next task for host managed-node3 19665 1727204167.23581: done getting next task for host managed-node3 19665 1727204167.23582: ^ task is: None 19665 1727204167.23583: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204167.23623: in VariableManager get_vars() 19665 1727204167.23641: done with get_vars() 19665 1727204167.23647: in VariableManager get_vars() 19665 1727204167.23656: done with get_vars() 19665 1727204167.23660: variable 'omit' from source: magic vars 19665 1727204167.23775: variable 'task' from source: play vars 19665 1727204167.23806: in VariableManager get_vars() 19665 1727204167.23817: done with get_vars() 19665 1727204167.23837: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_present.yml] *********************** 19665 1727204167.24012: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204167.24034: getting the remaining hosts for this loop 19665 1727204167.24036: done getting the remaining hosts for this loop 19665 1727204167.24038: getting the next task for host managed-node3 19665 1727204167.24041: done getting next task for host managed-node3 19665 1727204167.24043: ^ task is: TASK: Gathering Facts 19665 1727204167.24045: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204167.24046: getting variables 19665 1727204167.24047: in VariableManager get_vars() 19665 1727204167.24056: Calling all_inventory to load vars for managed-node3 19665 1727204167.24058: Calling groups_inventory to load vars for managed-node3 19665 1727204167.24060: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204167.24068: Calling all_plugins_play to load vars for managed-node3 19665 1727204167.24071: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204167.24074: Calling groups_plugins_play to load vars for managed-node3 19665 1727204167.25286: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204167.26955: done with get_vars() 19665 1727204167.26977: done getting variables 19665 1727204167.27021: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:07 -0400 (0:00:00.213) 0:00:18.137 ***** 19665 1727204167.27047: entering _queue_task() for managed-node3/gather_facts 19665 1727204167.27362: worker is 1 (out of 1 available) 19665 1727204167.27377: exiting _queue_task() for managed-node3/gather_facts 19665 1727204167.27387: done queuing things up, now waiting for results queue to drain 19665 1727204167.27389: waiting for pending results... 
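The Gathering Facts records that follow show the usual module pipeline at this verbosity: a remote temp directory is created under ~/.ansible/tmp, AnsiballZ_setup.py is transferred over sftp, marked executable, and run with the remote Python, and the module prints a single JSON document on stdout whose ansible_facts include ansible_interfaces (here lo, eth0 and LSR-TST-br31). A minimal sketch of consuming such a result is below; the trimmed payload and variable names are assumptions for illustration, not Ansible's own parsing code.

    import json

    # Example module output, trimmed to the one key used below; the real
    # payload in the log carries the full hardware/network fact set.
    raw = '{"ansible_facts": {"ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"]}}'

    facts = json.loads(raw)["ansible_facts"]
    # The later profile/device assertions in this test boil down to checks
    # like this one against the gathered facts.
    assert "LSR-TST-br31" in facts["ansible_interfaces"], "interface missing from facts"
    print("interface present in gathered facts")

This is only a reading aid for the large JSON blocks that follow; the playbook itself performs the equivalent checks through stat and assert tasks, as seen earlier in the log.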
19665 1727204167.28251: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204167.28362: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000024e 19665 1727204167.28397: variable 'ansible_search_path' from source: unknown 19665 1727204167.28439: calling self._execute() 19665 1727204167.28539: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.28598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.28612: variable 'omit' from source: magic vars 19665 1727204167.29110: variable 'ansible_distribution_major_version' from source: facts 19665 1727204167.29133: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204167.29144: variable 'omit' from source: magic vars 19665 1727204167.29184: variable 'omit' from source: magic vars 19665 1727204167.29227: variable 'omit' from source: magic vars 19665 1727204167.29279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204167.29319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204167.29349: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204167.29374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204167.29390: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204167.29426: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204167.29434: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.29442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.29546: Set connection var ansible_connection to ssh 19665 1727204167.29559: Set connection var ansible_shell_type to sh 19665 1727204167.29574: Set connection var ansible_timeout to 10 19665 1727204167.29583: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204167.29594: Set connection var ansible_shell_executable to /bin/sh 19665 1727204167.29605: Set connection var ansible_pipelining to False 19665 1727204167.29632: variable 'ansible_shell_executable' from source: unknown 19665 1727204167.29640: variable 'ansible_connection' from source: unknown 19665 1727204167.29646: variable 'ansible_module_compression' from source: unknown 19665 1727204167.29652: variable 'ansible_shell_type' from source: unknown 19665 1727204167.29658: variable 'ansible_shell_executable' from source: unknown 19665 1727204167.29667: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204167.29678: variable 'ansible_pipelining' from source: unknown 19665 1727204167.29684: variable 'ansible_timeout' from source: unknown 19665 1727204167.29692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204167.29873: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204167.29890: variable 'omit' from source: magic vars 19665 1727204167.29898: starting attempt loop 19665 1727204167.29903: running the 
handler 19665 1727204167.29920: variable 'ansible_facts' from source: unknown 19665 1727204167.29943: _low_level_execute_command(): starting 19665 1727204167.29955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204167.31058: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204167.31211: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.31229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.31249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.31294: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.31310: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204167.31325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.31345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204167.31358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204167.31373: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204167.31386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.31400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.31419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.31433: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.31445: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204167.31460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.31652: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204167.31678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.31695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.31776: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.33403: stdout chunk (state=3): >>>/root <<< 19665 1727204167.33606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204167.33609: stdout chunk (state=3): >>><<< 19665 1727204167.33612: stderr chunk (state=3): >>><<< 19665 1727204167.33735: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204167.33739: _low_level_execute_command(): starting 19665 1727204167.33742: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272 `" && echo ansible-tmp-1727204167.3363502-21233-121629717571272="` echo /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272 `" ) && sleep 0' 19665 1727204167.34363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204167.34389: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.34408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.34428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.34473: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.34486: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204167.34512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.34530: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204167.34546: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204167.34558: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204167.34573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.34586: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.34610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.34628: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.34641: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204167.34656: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.34742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204167.34768: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.34785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.34862: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.36832: stdout chunk (state=3): >>>ansible-tmp-1727204167.3363502-21233-121629717571272=/root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272 <<< 19665 1727204167.36941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204167.37038: stderr chunk (state=3): >>><<< 19665 1727204167.37052: stdout chunk (state=3): >>><<< 19665 1727204167.37371: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204167.3363502-21233-121629717571272=/root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204167.37375: variable 'ansible_module_compression' from source: unknown 19665 1727204167.37377: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204167.37379: variable 'ansible_facts' from source: unknown 19665 1727204167.37413: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/AnsiballZ_setup.py 19665 1727204167.37711: Sending initial data 19665 1727204167.37715: Sent initial data (154 bytes) 19665 1727204167.38808: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204167.38830: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.38853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.38877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.38920: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.38941: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204167.38961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.38982: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204167.38995: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204167.39007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204167.39020: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.39034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.39062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.39079: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.39091: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204167.39105: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.39192: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204167.39209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.39224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.39307: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.41057: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204167.41105: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204167.41149: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpz359lwur /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/AnsiballZ_setup.py <<< 19665 1727204167.41200: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204167.43747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204167.43882: stderr chunk (state=3): >>><<< 19665 1727204167.43885: stdout chunk (state=3): >>><<< 19665 1727204167.43888: done transferring module to remote 19665 1727204167.43894: _low_level_execute_command(): starting 19665 1727204167.43896: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/ /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/AnsiballZ_setup.py && sleep 0' 19665 1727204167.44521: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204167.44542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.44558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.44581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.44625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.44644: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204167.44660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.44681: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204167.44694: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204167.44705: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204167.44723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204167.44740: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.44760: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.44775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204167.44788: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204167.44802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.44885: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204167.44902: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.44917: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.44995: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.46739: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204167.46790: stderr chunk (state=3): >>><<< 19665 1727204167.46794: stdout chunk (state=3): >>><<< 19665 1727204167.46806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204167.46810: _low_level_execute_command(): starting 19665 1727204167.46813: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/AnsiballZ_setup.py && sleep 0' 19665 1727204167.47310: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.47314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204167.47344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204167.47370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 19665 1727204167.47373: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204167.47376: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204167.47442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204167.47464: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204167.47563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204167.98539: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distrib<<< 19665 1727204167.98565: stdout chunk (state=3): >>>ution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "07", "epoch": "1727204167", "epoch_int": "1727204167", "date": "2024-09-24", "time": "14:56:07", "iso8601_micro": "2024-09-24T18:56:07.717590Z", "iso8601": "2024-09-24T18:56:07Z", "iso8601_basic": "20240924T145607717590", "iso8601_basic_short": "20240924T145607", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on 
[fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": 
"on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust"<<< 19665 1727204167.98575: stdout chunk (state=3): >>>: "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off 
[fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.31, "5m": 0.33, "15m": 0.16}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2816, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 716, "free": 2816}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 513, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282099712, "block_size": 4096, "block_total": 65519355, "block_available": 64521997, "block_used": 997358, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 
1727204168.00212: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204168.00275: stderr chunk (state=3): >>><<< 19665 1727204168.00278: stdout chunk (state=3): >>><<< 19665 1727204168.00318: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", 
"DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_apparmor": {"status": "disabled"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "07", "epoch": "1727204167", "epoch_int": "1727204167", "date": "2024-09-24", "time": "14:56:07", "iso8601_micro": "2024-09-24T18:56:07.717590Z", "iso8601": "2024-09-24T18:56:07Z", "iso8601_basic": "20240924T145607717590", "iso8601_basic_short": "20240924T145607", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0", "LSR-TST-br31"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", 
"tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": 
"off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", 
"mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_loadavg": {"1m": 0.31, "5m": 0.33, "15m": 0.16}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2816, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 716, "free": 2816}, "nocache": {"free": 3275, "used": 257}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 513, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282099712, "block_size": 4096, "block_total": 65519355, "block_available": 64521997, "block_used": 997358, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204168.00565: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204168.00584: _low_level_execute_command(): starting 19665 1727204168.00587: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204167.3363502-21233-121629717571272/ > /dev/null 2>&1 && sleep 0' 19665 1727204168.01072: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.01085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.01103: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.01115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204168.01124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.01176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.01188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.01238: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.03010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.03073: stderr chunk (state=3): >>><<< 19665 1727204168.03076: stdout chunk (state=3): >>><<< 19665 1727204168.03096: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.03102: handler run complete 19665 1727204168.03194: variable 'ansible_facts' from source: unknown 19665 1727204168.03267: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.03468: variable 'ansible_facts' from source: unknown 19665 1727204168.03528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.03617: attempt loop complete, returning result 19665 1727204168.03621: _execute() done 19665 1727204168.03623: dumping result to json 19665 1727204168.03650: done dumping result, returning 19665 1727204168.03656: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-00000000024e] 19665 1727204168.03663: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000024e 19665 1727204168.03973: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000024e ok: [managed-node3] 19665 1727204168.04197: no more pending results, returning what we have 19665 1727204168.04199: results queue empty 19665 1727204168.04200: checking for any_errors_fatal 19665 1727204168.04201: done checking for any_errors_fatal 19665 1727204168.04201: checking for max_fail_percentage 19665 1727204168.04203: done checking for max_fail_percentage 19665 1727204168.04203: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.04204: done checking to see if all hosts have failed 19665 1727204168.04204: getting the remaining hosts for this loop 19665 1727204168.04205: done getting the remaining hosts for this loop 19665 1727204168.04208: getting the next task for host managed-node3 19665 1727204168.04212: done getting next task for host managed-node3 19665 1727204168.04213: ^ task is: TASK: meta (flush_handlers) 19665 1727204168.04214: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204168.04217: getting variables 19665 1727204168.04218: in VariableManager get_vars() 19665 1727204168.04235: Calling all_inventory to load vars for managed-node3 19665 1727204168.04238: Calling groups_inventory to load vars for managed-node3 19665 1727204168.04241: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.04251: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.04252: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.04255: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.04772: WORKER PROCESS EXITING 19665 1727204168.05137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.06071: done with get_vars() 19665 1727204168.06089: done getting variables 19665 1727204168.06141: in VariableManager get_vars() 19665 1727204168.06148: Calling all_inventory to load vars for managed-node3 19665 1727204168.06150: Calling groups_inventory to load vars for managed-node3 19665 1727204168.06152: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.06155: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.06157: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.06158: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.06828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.07832: done with get_vars() 19665 1727204168.07853: done queuing things up, now waiting for results queue to drain 19665 1727204168.07854: results queue empty 19665 1727204168.07855: checking for any_errors_fatal 19665 1727204168.07858: done checking for any_errors_fatal 19665 1727204168.07858: checking for max_fail_percentage 19665 1727204168.07859: done checking for max_fail_percentage 19665 1727204168.07859: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.07865: done checking to see if all hosts have failed 19665 1727204168.07866: getting the remaining hosts for this loop 19665 1727204168.07867: done getting the remaining hosts for this loop 19665 1727204168.07869: getting the next task for host managed-node3 19665 1727204168.07871: done getting next task for host managed-node3 19665 1727204168.07873: ^ task is: TASK: Include the task '{{ task }}' 19665 1727204168.07874: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204168.07876: getting variables 19665 1727204168.07877: in VariableManager get_vars() 19665 1727204168.07883: Calling all_inventory to load vars for managed-node3 19665 1727204168.07884: Calling groups_inventory to load vars for managed-node3 19665 1727204168.07886: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.07890: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.07891: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.07893: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.08563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.09461: done with get_vars() 19665 1727204168.09479: done getting variables 19665 1727204168.09605: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_present.yml'] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.825) 0:00:18.962 ***** 19665 1727204168.09627: entering _queue_task() for managed-node3/include_tasks 19665 1727204168.09879: worker is 1 (out of 1 available) 19665 1727204168.09893: exiting _queue_task() for managed-node3/include_tasks 19665 1727204168.09907: done queuing things up, now waiting for results queue to drain 19665 1727204168.09908: waiting for pending results... 19665 1727204168.10090: running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_profile_present.yml' 19665 1727204168.10151: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000031 19665 1727204168.10163: variable 'ansible_search_path' from source: unknown 19665 1727204168.10193: calling self._execute() 19665 1727204168.10268: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.10273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.10282: variable 'omit' from source: magic vars 19665 1727204168.10572: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.10583: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.10589: variable 'task' from source: play vars 19665 1727204168.10639: variable 'task' from source: play vars 19665 1727204168.10646: _execute() done 19665 1727204168.10650: dumping result to json 19665 1727204168.10653: done dumping result, returning 19665 1727204168.10659: done running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_profile_present.yml' [0affcd87-79f5-0dcc-3ea6-000000000031] 19665 1727204168.10672: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000031 19665 1727204168.10758: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000031 19665 1727204168.10761: WORKER PROCESS EXITING 19665 1727204168.10802: no more pending results, returning what we have 19665 1727204168.10807: in VariableManager get_vars() 19665 1727204168.10841: Calling all_inventory to load vars for managed-node3 19665 1727204168.10843: Calling groups_inventory to load vars for managed-node3 19665 1727204168.10846: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.10865: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.10875: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.10879: Calling groups_plugins_play to load vars for 
managed-node3 19665 1727204168.11799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.12713: done with get_vars() 19665 1727204168.12728: variable 'ansible_search_path' from source: unknown 19665 1727204168.12740: we have included files to process 19665 1727204168.12741: generating all_blocks data 19665 1727204168.12742: done generating all_blocks data 19665 1727204168.12743: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19665 1727204168.12744: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19665 1727204168.12745: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 19665 1727204168.12885: in VariableManager get_vars() 19665 1727204168.12897: done with get_vars() 19665 1727204168.13076: done processing included file 19665 1727204168.13078: iterating over new_blocks loaded from include file 19665 1727204168.13079: in VariableManager get_vars() 19665 1727204168.13086: done with get_vars() 19665 1727204168.13087: filtering new block on tags 19665 1727204168.13100: done filtering new block on tags 19665 1727204168.13102: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 19665 1727204168.13106: extending task lists for all hosts with included blocks 19665 1727204168.13124: done extending task lists 19665 1727204168.13125: done processing included files 19665 1727204168.13125: results queue empty 19665 1727204168.13126: checking for any_errors_fatal 19665 1727204168.13127: done checking for any_errors_fatal 19665 1727204168.13127: checking for max_fail_percentage 19665 1727204168.13128: done checking for max_fail_percentage 19665 1727204168.13128: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.13129: done checking to see if all hosts have failed 19665 1727204168.13129: getting the remaining hosts for this loop 19665 1727204168.13130: done getting the remaining hosts for this loop 19665 1727204168.13132: getting the next task for host managed-node3 19665 1727204168.13135: done getting next task for host managed-node3 19665 1727204168.13137: ^ task is: TASK: Include the task 'get_profile_stat.yml' 19665 1727204168.13139: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204168.13141: getting variables 19665 1727204168.13141: in VariableManager get_vars() 19665 1727204168.13148: Calling all_inventory to load vars for managed-node3 19665 1727204168.13150: Calling groups_inventory to load vars for managed-node3 19665 1727204168.13152: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.13156: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.13157: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.13159: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.13853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.14748: done with get_vars() 19665 1727204168.14765: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.051) 0:00:19.014 ***** 19665 1727204168.14815: entering _queue_task() for managed-node3/include_tasks 19665 1727204168.15058: worker is 1 (out of 1 available) 19665 1727204168.15072: exiting _queue_task() for managed-node3/include_tasks 19665 1727204168.15085: done queuing things up, now waiting for results queue to drain 19665 1727204168.15087: waiting for pending results... 19665 1727204168.15262: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 19665 1727204168.15329: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000025f 19665 1727204168.15339: variable 'ansible_search_path' from source: unknown 19665 1727204168.15344: variable 'ansible_search_path' from source: unknown 19665 1727204168.15374: calling self._execute() 19665 1727204168.15439: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.15446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.15455: variable 'omit' from source: magic vars 19665 1727204168.15738: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.15752: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.15761: _execute() done 19665 1727204168.15767: dumping result to json 19665 1727204168.15770: done dumping result, returning 19665 1727204168.15776: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-0dcc-3ea6-00000000025f] 19665 1727204168.15782: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000025f 19665 1727204168.15858: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000025f 19665 1727204168.15862: WORKER PROCESS EXITING 19665 1727204168.15887: no more pending results, returning what we have 19665 1727204168.15891: in VariableManager get_vars() 19665 1727204168.15921: Calling all_inventory to load vars for managed-node3 19665 1727204168.15923: Calling groups_inventory to load vars for managed-node3 19665 1727204168.15927: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.15942: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.15946: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.15948: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.16796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 19665 1727204168.17711: done with get_vars() 19665 1727204168.17724: variable 'ansible_search_path' from source: unknown 19665 1727204168.17725: variable 'ansible_search_path' from source: unknown 19665 1727204168.17731: variable 'task' from source: play vars 19665 1727204168.17811: variable 'task' from source: play vars 19665 1727204168.17835: we have included files to process 19665 1727204168.17836: generating all_blocks data 19665 1727204168.17839: done generating all_blocks data 19665 1727204168.17840: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204168.17841: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204168.17842: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204168.18539: done processing included file 19665 1727204168.18541: iterating over new_blocks loaded from include file 19665 1727204168.18542: in VariableManager get_vars() 19665 1727204168.18550: done with get_vars() 19665 1727204168.18551: filtering new block on tags 19665 1727204168.18569: done filtering new block on tags 19665 1727204168.18571: in VariableManager get_vars() 19665 1727204168.18580: done with get_vars() 19665 1727204168.18581: filtering new block on tags 19665 1727204168.18593: done filtering new block on tags 19665 1727204168.18594: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 19665 1727204168.18598: extending task lists for all hosts with included blocks 19665 1727204168.18694: done extending task lists 19665 1727204168.18695: done processing included files 19665 1727204168.18696: results queue empty 19665 1727204168.18697: checking for any_errors_fatal 19665 1727204168.18699: done checking for any_errors_fatal 19665 1727204168.18700: checking for max_fail_percentage 19665 1727204168.18700: done checking for max_fail_percentage 19665 1727204168.18701: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.18701: done checking to see if all hosts have failed 19665 1727204168.18702: getting the remaining hosts for this loop 19665 1727204168.18703: done getting the remaining hosts for this loop 19665 1727204168.18704: getting the next task for host managed-node3 19665 1727204168.18707: done getting next task for host managed-node3 19665 1727204168.18708: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 19665 1727204168.18711: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204168.18712: getting variables 19665 1727204168.18713: in VariableManager get_vars() 19665 1727204168.18751: Calling all_inventory to load vars for managed-node3 19665 1727204168.18753: Calling groups_inventory to load vars for managed-node3 19665 1727204168.18754: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.18758: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.18759: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.18761: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.19456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.23329: done with get_vars() 19665 1727204168.23355: done getting variables 19665 1727204168.23386: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.085) 0:00:19.100 ***** 19665 1727204168.23404: entering _queue_task() for managed-node3/set_fact 19665 1727204168.23728: worker is 1 (out of 1 available) 19665 1727204168.23746: exiting _queue_task() for managed-node3/set_fact 19665 1727204168.23763: done queuing things up, now waiting for results queue to drain 19665 1727204168.23769: waiting for pending results... 
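For orientation, the two records that follow come from the tasks at get_profile_stat.yml lines 3 and 9 ("Initialize NM profile exist and ansible_managed comment flag" and "Stat profile file"). The log does not show the contents of that file, only the fact names and play variables it uses, so the sketch below is an approximation reconstructed from this output; the ifcfg path and the registered variable name are assumptions, not confirmed by the log.

  # Sketch inferred from the facts and variables visible in this log; the real
  # tasks/get_profile_stat.yml may differ in wording and paths.
  - name: Initialize NM profile exist and ansible_managed comment flag
    set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false

  - name: Stat profile file
    # 'profile' comes from play vars and 'interface' from set_fact, as the log shows;
    # the exact path stat'ed here is an assumption for illustration only.
    stat:
      path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
    register: profile_stat

The subsequent records are consistent with this shape: the set_fact handler completes locally (no remote command), while the stat task triggers the usual remote round trip (echo ~, mkdir of the ansible-tmp directory, SFTP transfer of AnsiballZ_stat.py, then execution with python3.9).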
19665 1727204168.24135: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 19665 1727204168.24311: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000026c 19665 1727204168.24341: variable 'ansible_search_path' from source: unknown 19665 1727204168.24354: variable 'ansible_search_path' from source: unknown 19665 1727204168.24402: calling self._execute() 19665 1727204168.24513: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.24528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.24555: variable 'omit' from source: magic vars 19665 1727204168.24868: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.24894: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.24909: variable 'omit' from source: magic vars 19665 1727204168.24955: variable 'omit' from source: magic vars 19665 1727204168.24987: variable 'omit' from source: magic vars 19665 1727204168.25025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204168.25053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204168.25073: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204168.25088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.25097: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.25125: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204168.25128: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.25131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.25253: Set connection var ansible_connection to ssh 19665 1727204168.25271: Set connection var ansible_shell_type to sh 19665 1727204168.25292: Set connection var ansible_timeout to 10 19665 1727204168.25308: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204168.25324: Set connection var ansible_shell_executable to /bin/sh 19665 1727204168.25351: Set connection var ansible_pipelining to False 19665 1727204168.25386: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.25396: variable 'ansible_connection' from source: unknown 19665 1727204168.25407: variable 'ansible_module_compression' from source: unknown 19665 1727204168.25418: variable 'ansible_shell_type' from source: unknown 19665 1727204168.25429: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.25443: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.25450: variable 'ansible_pipelining' from source: unknown 19665 1727204168.25457: variable 'ansible_timeout' from source: unknown 19665 1727204168.25461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.25566: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204168.25576: variable 
'omit' from source: magic vars 19665 1727204168.25581: starting attempt loop 19665 1727204168.25584: running the handler 19665 1727204168.25608: handler run complete 19665 1727204168.25660: attempt loop complete, returning result 19665 1727204168.25688: _execute() done 19665 1727204168.25701: dumping result to json 19665 1727204168.25710: done dumping result, returning 19665 1727204168.25721: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-0dcc-3ea6-00000000026c] 19665 1727204168.25731: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026c 19665 1727204168.25822: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026c 19665 1727204168.25824: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 19665 1727204168.25876: no more pending results, returning what we have 19665 1727204168.25879: results queue empty 19665 1727204168.25881: checking for any_errors_fatal 19665 1727204168.25882: done checking for any_errors_fatal 19665 1727204168.25883: checking for max_fail_percentage 19665 1727204168.25884: done checking for max_fail_percentage 19665 1727204168.25885: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.25886: done checking to see if all hosts have failed 19665 1727204168.25887: getting the remaining hosts for this loop 19665 1727204168.25888: done getting the remaining hosts for this loop 19665 1727204168.25893: getting the next task for host managed-node3 19665 1727204168.25901: done getting next task for host managed-node3 19665 1727204168.25903: ^ task is: TASK: Stat profile file 19665 1727204168.25907: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204168.25910: getting variables 19665 1727204168.25912: in VariableManager get_vars() 19665 1727204168.25941: Calling all_inventory to load vars for managed-node3 19665 1727204168.25943: Calling groups_inventory to load vars for managed-node3 19665 1727204168.25947: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.25958: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.25960: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.25963: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.28052: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.29010: done with get_vars() 19665 1727204168.29025: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.056) 0:00:19.157 ***** 19665 1727204168.29097: entering _queue_task() for managed-node3/stat 19665 1727204168.29316: worker is 1 (out of 1 available) 19665 1727204168.29330: exiting _queue_task() for managed-node3/stat 19665 1727204168.29343: done queuing things up, now waiting for results queue to drain 19665 1727204168.29345: waiting for pending results... 19665 1727204168.29522: running TaskExecutor() for managed-node3/TASK: Stat profile file 19665 1727204168.29604: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000026d 19665 1727204168.29615: variable 'ansible_search_path' from source: unknown 19665 1727204168.29620: variable 'ansible_search_path' from source: unknown 19665 1727204168.29652: calling self._execute() 19665 1727204168.29727: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.29731: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.29742: variable 'omit' from source: magic vars 19665 1727204168.30026: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.30037: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.30044: variable 'omit' from source: magic vars 19665 1727204168.30079: variable 'omit' from source: magic vars 19665 1727204168.30154: variable 'profile' from source: play vars 19665 1727204168.30165: variable 'interface' from source: set_fact 19665 1727204168.30212: variable 'interface' from source: set_fact 19665 1727204168.30229: variable 'omit' from source: magic vars 19665 1727204168.30264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204168.30294: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204168.30311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204168.30323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.30334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.30359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204168.30362: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.30366: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.30430: Set connection var ansible_connection to ssh 19665 1727204168.30437: Set connection var ansible_shell_type to sh 19665 1727204168.30443: Set connection var ansible_timeout to 10 19665 1727204168.30448: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204168.30456: Set connection var ansible_shell_executable to /bin/sh 19665 1727204168.30463: Set connection var ansible_pipelining to False 19665 1727204168.30482: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.30485: variable 'ansible_connection' from source: unknown 19665 1727204168.30493: variable 'ansible_module_compression' from source: unknown 19665 1727204168.30496: variable 'ansible_shell_type' from source: unknown 19665 1727204168.30499: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.30501: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.30503: variable 'ansible_pipelining' from source: unknown 19665 1727204168.30506: variable 'ansible_timeout' from source: unknown 19665 1727204168.30510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.30752: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204168.30768: variable 'omit' from source: magic vars 19665 1727204168.30779: starting attempt loop 19665 1727204168.30787: running the handler 19665 1727204168.30801: _low_level_execute_command(): starting 19665 1727204168.30811: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204168.31662: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.31680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.31694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.31722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.31770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.31788: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204168.31801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.31826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204168.31845: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204168.31857: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204168.31875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.31893: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.31908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.31925: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.31940: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204168.31958: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.32051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.32073: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.32087: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.32175: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.33794: stdout chunk (state=3): >>>/root <<< 19665 1727204168.33984: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.33988: stdout chunk (state=3): >>><<< 19665 1727204168.33990: stderr chunk (state=3): >>><<< 19665 1727204168.34076: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.34079: _low_level_execute_command(): starting 19665 1727204168.34083: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983 `" && echo ansible-tmp-1727204168.3401034-21261-56987375369983="` echo /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983 `" ) && sleep 0' 19665 1727204168.35363: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.35537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.35555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.35576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.35618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.35631: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204168.35649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.35673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204168.35685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204168.35695: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204168.35708: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.35720: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.35734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.35748: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.35758: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204168.35772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.35850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.35875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.35891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.35969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.37805: stdout chunk (state=3): >>>ansible-tmp-1727204168.3401034-21261-56987375369983=/root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983 <<< 19665 1727204168.38070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.38074: stdout chunk (state=3): >>><<< 19665 1727204168.38077: stderr chunk (state=3): >>><<< 19665 1727204168.38081: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204168.3401034-21261-56987375369983=/root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.38083: variable 'ansible_module_compression' from source: unknown 19665 1727204168.38171: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19665 1727204168.38273: variable 'ansible_facts' from source: unknown 19665 1727204168.38291: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/AnsiballZ_stat.py 19665 1727204168.38423: Sending initial data 19665 1727204168.38433: Sent initial data (152 bytes) 19665 1727204168.39100: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.39103: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.39135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.39141: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.39144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.39195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.39199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.39245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.40996: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204168.41045: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 19665 1727204168.41058: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 19665 1727204168.41072: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 19665 1727204168.41130: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpo85s9wo6 /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/AnsiballZ_stat.py <<< 19665 1727204168.41191: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204168.42187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.42348: stderr chunk (state=3): >>><<< 19665 1727204168.42352: stdout chunk (state=3): >>><<< 19665 1727204168.42354: done transferring module to remote 19665 1727204168.42356: _low_level_execute_command(): starting 19665 1727204168.42359: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/ /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/AnsiballZ_stat.py && sleep 0' 19665 1727204168.42756: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.42759: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.42785: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.42806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.42809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.42866: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.42873: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.42916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.44670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.44711: stderr chunk (state=3): >>><<< 19665 1727204168.44716: stdout chunk (state=3): >>><<< 19665 1727204168.44732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.44735: _low_level_execute_command(): starting 19665 1727204168.44741: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/AnsiballZ_stat.py && sleep 0' 19665 1727204168.45428: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.45431: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.45489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.45492: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 
19665 1727204168.45494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.45496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204168.45499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.45571: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.45574: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.45637: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.58970: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19665 1727204168.60053: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204168.60140: stderr chunk (state=3): >>><<< 19665 1727204168.60144: stdout chunk (state=3): >>><<< 19665 1727204168.60284: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
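The stat invocation echoed in the stdout above (module_args with path /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31 and get_attributes/get_checksum/get_mime disabled) is the payload of the "Stat profile file" task from get_profile_stat.yml. The original YAML is not shown in this log, so the following reconstruction is only an assumption built from those logged arguments and from the profile_stat variable referenced by a later conditional; the real task may word things differently and may template the interface name.

    # Hypothetical reconstruction of the "Stat profile file" task (not the original source).
    # The arguments mirror the module_args printed in the log; the register name is taken
    # from the later "profile_stat.stat.exists" conditional.
    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat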
19665 1727204168.60288: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204168.60292: _low_level_execute_command(): starting 19665 1727204168.60294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204168.3401034-21261-56987375369983/ > /dev/null 2>&1 && sleep 0' 19665 1727204168.60895: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.60909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.60926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.60945: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.60996: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.61010: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204168.61025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.61043: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204168.61057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204168.61071: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204168.61085: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.61100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.61116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.61129: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204168.61141: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204168.61156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.61235: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.61253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.61271: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.61353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.63189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.63302: stderr chunk (state=3): >>><<< 19665 1727204168.63314: stdout chunk (state=3): >>><<< 19665 1727204168.63578: _low_level_execute_command() 
done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.63582: handler run complete 19665 1727204168.63584: attempt loop complete, returning result 19665 1727204168.63586: _execute() done 19665 1727204168.63588: dumping result to json 19665 1727204168.63590: done dumping result, returning 19665 1727204168.63592: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-0dcc-3ea6-00000000026d] 19665 1727204168.63594: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026d 19665 1727204168.63666: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026d 19665 1727204168.63669: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 19665 1727204168.63741: no more pending results, returning what we have 19665 1727204168.63744: results queue empty 19665 1727204168.63746: checking for any_errors_fatal 19665 1727204168.63751: done checking for any_errors_fatal 19665 1727204168.63752: checking for max_fail_percentage 19665 1727204168.63753: done checking for max_fail_percentage 19665 1727204168.63754: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.63755: done checking to see if all hosts have failed 19665 1727204168.63756: getting the remaining hosts for this loop 19665 1727204168.63758: done getting the remaining hosts for this loop 19665 1727204168.63762: getting the next task for host managed-node3 19665 1727204168.63771: done getting next task for host managed-node3 19665 1727204168.63775: ^ task is: TASK: Set NM profile exist flag based on the profile files 19665 1727204168.63779: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 19665 1727204168.63785: getting variables 19665 1727204168.63786: in VariableManager get_vars() 19665 1727204168.63817: Calling all_inventory to load vars for managed-node3 19665 1727204168.63819: Calling groups_inventory to load vars for managed-node3 19665 1727204168.63823: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.63834: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.63840: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.63843: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.65762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.67592: done with get_vars() 19665 1727204168.67620: done getting variables 19665 1727204168.67707: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.386) 0:00:19.544 ***** 19665 1727204168.67745: entering _queue_task() for managed-node3/set_fact 19665 1727204168.68136: worker is 1 (out of 1 available) 19665 1727204168.68151: exiting _queue_task() for managed-node3/set_fact 19665 1727204168.68166: done queuing things up, now waiting for results queue to drain 19665 1727204168.68167: waiting for pending results... 
19665 1727204168.68489: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 19665 1727204168.68660: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000026e 19665 1727204168.68682: variable 'ansible_search_path' from source: unknown 19665 1727204168.68688: variable 'ansible_search_path' from source: unknown 19665 1727204168.68731: calling self._execute() 19665 1727204168.68843: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.68857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.68883: variable 'omit' from source: magic vars 19665 1727204168.69329: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.69342: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.69431: variable 'profile_stat' from source: set_fact 19665 1727204168.69445: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204168.69448: when evaluation is False, skipping this task 19665 1727204168.69451: _execute() done 19665 1727204168.69454: dumping result to json 19665 1727204168.69456: done dumping result, returning 19665 1727204168.69461: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-0dcc-3ea6-00000000026e] 19665 1727204168.69467: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026e 19665 1727204168.69557: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026e 19665 1727204168.69560: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204168.69626: no more pending results, returning what we have 19665 1727204168.69629: results queue empty 19665 1727204168.69630: checking for any_errors_fatal 19665 1727204168.69641: done checking for any_errors_fatal 19665 1727204168.69642: checking for max_fail_percentage 19665 1727204168.69644: done checking for max_fail_percentage 19665 1727204168.69645: checking to see if all hosts have failed and the running result is not ok 19665 1727204168.69645: done checking to see if all hosts have failed 19665 1727204168.69646: getting the remaining hosts for this loop 19665 1727204168.69648: done getting the remaining hosts for this loop 19665 1727204168.69653: getting the next task for host managed-node3 19665 1727204168.69660: done getting next task for host managed-node3 19665 1727204168.69668: ^ task is: TASK: Get NM profile info 19665 1727204168.69672: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204168.69678: getting variables 19665 1727204168.69680: in VariableManager get_vars() 19665 1727204168.69711: Calling all_inventory to load vars for managed-node3 19665 1727204168.69714: Calling groups_inventory to load vars for managed-node3 19665 1727204168.69717: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204168.69727: Calling all_plugins_play to load vars for managed-node3 19665 1727204168.69730: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204168.69732: Calling groups_plugins_play to load vars for managed-node3 19665 1727204168.70554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204168.71494: done with get_vars() 19665 1727204168.71512: done getting variables 19665 1727204168.71585: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:08 -0400 (0:00:00.038) 0:00:19.582 ***** 19665 1727204168.71608: entering _queue_task() for managed-node3/shell 19665 1727204168.71609: Creating lock for shell 19665 1727204168.71887: worker is 1 (out of 1 available) 19665 1727204168.71901: exiting _queue_task() for managed-node3/shell 19665 1727204168.71919: done queuing things up, now waiting for results queue to drain 19665 1727204168.71921: waiting for pending results... 
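The "Set NM profile exist flag based on the profile files" task above was skipped because its conditional, profile_stat.stat.exists, evaluated to False (the initscripts ifcfg file is absent on this host). The task source is not in the log, so the following is only a plausible sketch of such a guarded set_fact; the fact name is an assumption borrowed from the facts set later in the run, while the when clause matches the false_condition reported above.

    # Illustrative sketch only; the fact name is an assumption, the when clause matches
    # the "false_condition" reported in the log.
    - name: Set NM profile exist flag based on the profile files
      set_fact:
        lsr_net_profile_exists: true
      when: profile_stat.stat.exists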
19665 1727204168.72200: running TaskExecutor() for managed-node3/TASK: Get NM profile info 19665 1727204168.72333: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000026f 19665 1727204168.72353: variable 'ansible_search_path' from source: unknown 19665 1727204168.72360: variable 'ansible_search_path' from source: unknown 19665 1727204168.72408: calling self._execute() 19665 1727204168.72508: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.72526: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.72554: variable 'omit' from source: magic vars 19665 1727204168.72971: variable 'ansible_distribution_major_version' from source: facts 19665 1727204168.72989: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204168.72998: variable 'omit' from source: magic vars 19665 1727204168.73070: variable 'omit' from source: magic vars 19665 1727204168.73190: variable 'profile' from source: play vars 19665 1727204168.73199: variable 'interface' from source: set_fact 19665 1727204168.73265: variable 'interface' from source: set_fact 19665 1727204168.73292: variable 'omit' from source: magic vars 19665 1727204168.73345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204168.73386: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204168.73426: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204168.73450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.73466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204168.73499: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204168.73506: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.73516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.73619: Set connection var ansible_connection to ssh 19665 1727204168.73634: Set connection var ansible_shell_type to sh 19665 1727204168.73647: Set connection var ansible_timeout to 10 19665 1727204168.73657: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204168.73677: Set connection var ansible_shell_executable to /bin/sh 19665 1727204168.73695: Set connection var ansible_pipelining to False 19665 1727204168.73725: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.73736: variable 'ansible_connection' from source: unknown 19665 1727204168.73746: variable 'ansible_module_compression' from source: unknown 19665 1727204168.73751: variable 'ansible_shell_type' from source: unknown 19665 1727204168.73757: variable 'ansible_shell_executable' from source: unknown 19665 1727204168.73766: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204168.73774: variable 'ansible_pipelining' from source: unknown 19665 1727204168.73784: variable 'ansible_timeout' from source: unknown 19665 1727204168.73795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204168.73935: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204168.73961: variable 'omit' from source: magic vars 19665 1727204168.73972: starting attempt loop 19665 1727204168.73979: running the handler 19665 1727204168.73992: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204168.74017: _low_level_execute_command(): starting 19665 1727204168.74023: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204168.74762: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.74769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.74822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.74826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.74829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.74905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.74920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.75016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.76640: stdout chunk (state=3): >>>/root <<< 19665 1727204168.76741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.76808: stderr chunk (state=3): >>><<< 19665 1727204168.76811: stdout chunk (state=3): >>><<< 19665 1727204168.76833: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.76847: _low_level_execute_command(): starting 19665 1727204168.76853: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975 `" && echo ansible-tmp-1727204168.7683294-21282-121597743752975="` echo /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975 `" ) && sleep 0' 19665 1727204168.77319: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.77334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.77363: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.77370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.77422: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.77425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.77478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.79374: stdout chunk (state=3): >>>ansible-tmp-1727204168.7683294-21282-121597743752975=/root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975 <<< 19665 1727204168.79488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.79548: stderr chunk (state=3): >>><<< 19665 1727204168.79554: stdout chunk (state=3): >>><<< 19665 1727204168.79575: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204168.7683294-21282-121597743752975=/root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.79603: variable 'ansible_module_compression' from source: unknown 19665 1727204168.79648: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19665 1727204168.79685: variable 'ansible_facts' from source: unknown 19665 1727204168.79735: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/AnsiballZ_command.py 19665 1727204168.79850: Sending initial data 19665 1727204168.79853: Sent initial data (156 bytes) 19665 1727204168.80530: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.80537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.80586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.80589: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.80591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.80655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.80658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.80661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.80694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.82425: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19665 1727204168.82431: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204168.82451: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 19665 1727204168.82471: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 19665 1727204168.82476: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 19665 1727204168.82532: stdout chunk (state=3): >>>sftp> put 
/root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp3ptfdc2l /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/AnsiballZ_command.py <<< 19665 1727204168.82588: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204168.83506: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.83681: stderr chunk (state=3): >>><<< 19665 1727204168.83684: stdout chunk (state=3): >>><<< 19665 1727204168.83720: done transferring module to remote 19665 1727204168.83734: _low_level_execute_command(): starting 19665 1727204168.83738: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/ /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/AnsiballZ_command.py && sleep 0' 19665 1727204168.84573: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204168.84586: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.84636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204168.84641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.84660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.84725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204168.84738: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.84800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204168.86505: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204168.86580: stderr chunk (state=3): >>><<< 19665 1727204168.86583: stdout chunk (state=3): >>><<< 19665 1727204168.86598: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204168.86601: _low_level_execute_command(): starting 19665 1727204168.86606: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/AnsiballZ_command.py && sleep 0' 19665 1727204168.87096: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204168.87099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204168.87136: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.87139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204168.87142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204168.87197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204168.87200: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204168.87258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204169.02262: stdout chunk (state=3): >>> {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:09.002667", "end": "2024-09-24 14:56:09.021451", "delta": "0:00:00.018784", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204169.03488: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204169.03549: stderr chunk (state=3): >>><<< 19665 1727204169.03553: stdout chunk (state=3): >>><<< 19665 1727204169.03568: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:09.002667", "end": "2024-09-24 14:56:09.021451", "delta": "0:00:00.018784", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
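The "Get NM profile info" task runs the shell pipeline shown in the result above, nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc, which confirms that NetworkManager holds a persistent profile for LSR-TST-br31 under /etc/NetworkManager/system-connections/. A sketch of what that task likely looks like follows; the command is copied from the log and the register name from the nm_profile_exists.rc conditional evaluated later, while the rest of the YAML is assumed.

    # Sketch of the "Get NM profile info" task; command copied from the log, register name
    # taken from the later conditional, everything else assumed.
    - name: Get NM profile info
      shell: nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc
      register: nm_profile_exists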
19665 1727204169.03606: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204169.03612: _low_level_execute_command(): starting 19665 1727204169.03621: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204168.7683294-21282-121597743752975/ > /dev/null 2>&1 && sleep 0' 19665 1727204169.04093: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204169.04112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204169.04115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204169.04160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204169.04170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204169.04173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204169.04242: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204169.04275: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204169.04301: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204169.04339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204169.06137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204169.06263: stderr chunk (state=3): >>><<< 19665 1727204169.06283: stdout chunk (state=3): >>><<< 19665 1727204169.06307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204169.06351: handler run complete 19665 1727204169.06398: Evaluated conditional (False): False 19665 1727204169.06439: attempt loop complete, returning result 19665 1727204169.06445: _execute() done 19665 1727204169.06448: dumping result to json 19665 1727204169.06469: done dumping result, returning 19665 1727204169.06493: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-0dcc-3ea6-00000000026f] 19665 1727204169.06499: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026f 19665 1727204169.06624: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000026f 19665 1727204169.06627: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.018784", "end": "2024-09-24 14:56:09.021451", "rc": 0, "start": "2024-09-24 14:56:09.002667" } STDOUT: LSR-TST-br31 /etc/NetworkManager/system-connections/LSR-TST-br31.nmconnection 19665 1727204169.06706: no more pending results, returning what we have 19665 1727204169.06711: results queue empty 19665 1727204169.06713: checking for any_errors_fatal 19665 1727204169.06721: done checking for any_errors_fatal 19665 1727204169.06722: checking for max_fail_percentage 19665 1727204169.06723: done checking for max_fail_percentage 19665 1727204169.06724: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.06725: done checking to see if all hosts have failed 19665 1727204169.06725: getting the remaining hosts for this loop 19665 1727204169.06727: done getting the remaining hosts for this loop 19665 1727204169.06732: getting the next task for host managed-node3 19665 1727204169.06740: done getting next task for host managed-node3 19665 1727204169.06744: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19665 1727204169.06748: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.06754: getting variables 19665 1727204169.06756: in VariableManager get_vars() 19665 1727204169.06786: Calling all_inventory to load vars for managed-node3 19665 1727204169.06789: Calling groups_inventory to load vars for managed-node3 19665 1727204169.06792: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.06804: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.06807: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.06809: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.08360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.10010: done with get_vars() 19665 1727204169.10035: done getting variables 19665 1727204169.10094: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.385) 0:00:19.967 ***** 19665 1727204169.10130: entering _queue_task() for managed-node3/set_fact 19665 1727204169.11188: worker is 1 (out of 1 available) 19665 1727204169.11198: exiting _queue_task() for managed-node3/set_fact 19665 1727204169.11209: done queuing things up, now waiting for results queue to drain 19665 1727204169.11213: waiting for pending results... 
19665 1727204169.11480: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19665 1727204169.11758: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000270 19665 1727204169.11800: variable 'ansible_search_path' from source: unknown 19665 1727204169.11808: variable 'ansible_search_path' from source: unknown 19665 1727204169.11849: calling self._execute() 19665 1727204169.11947: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.11959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.11978: variable 'omit' from source: magic vars 19665 1727204169.12397: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.12415: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.12558: variable 'nm_profile_exists' from source: set_fact 19665 1727204169.12582: Evaluated conditional (nm_profile_exists.rc == 0): True 19665 1727204169.12597: variable 'omit' from source: magic vars 19665 1727204169.12650: variable 'omit' from source: magic vars 19665 1727204169.12694: variable 'omit' from source: magic vars 19665 1727204169.12744: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204169.12793: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204169.12823: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204169.12847: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.12869: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.12912: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204169.12921: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.12932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.13042: Set connection var ansible_connection to ssh 19665 1727204169.13056: Set connection var ansible_shell_type to sh 19665 1727204169.13070: Set connection var ansible_timeout to 10 19665 1727204169.13080: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204169.13096: Set connection var ansible_shell_executable to /bin/sh 19665 1727204169.13108: Set connection var ansible_pipelining to False 19665 1727204169.13134: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.13143: variable 'ansible_connection' from source: unknown 19665 1727204169.13150: variable 'ansible_module_compression' from source: unknown 19665 1727204169.13159: variable 'ansible_shell_type' from source: unknown 19665 1727204169.13172: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.13180: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.13187: variable 'ansible_pipelining' from source: unknown 19665 1727204169.13195: variable 'ansible_timeout' from source: unknown 19665 1727204169.13206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.13360: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204169.13384: variable 'omit' from source: magic vars 19665 1727204169.13399: starting attempt loop 19665 1727204169.13408: running the handler 19665 1727204169.13429: handler run complete 19665 1727204169.13444: attempt loop complete, returning result 19665 1727204169.13452: _execute() done 19665 1727204169.13459: dumping result to json 19665 1727204169.13469: done dumping result, returning 19665 1727204169.13481: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-0dcc-3ea6-000000000270] 19665 1727204169.13491: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000270 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 19665 1727204169.13653: no more pending results, returning what we have 19665 1727204169.13656: results queue empty 19665 1727204169.13657: checking for any_errors_fatal 19665 1727204169.13667: done checking for any_errors_fatal 19665 1727204169.13668: checking for max_fail_percentage 19665 1727204169.13670: done checking for max_fail_percentage 19665 1727204169.13671: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.13672: done checking to see if all hosts have failed 19665 1727204169.13673: getting the remaining hosts for this loop 19665 1727204169.13675: done getting the remaining hosts for this loop 19665 1727204169.13680: getting the next task for host managed-node3 19665 1727204169.13692: done getting next task for host managed-node3 19665 1727204169.13695: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 19665 1727204169.13699: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.13703: getting variables 19665 1727204169.13705: in VariableManager get_vars() 19665 1727204169.13740: Calling all_inventory to load vars for managed-node3 19665 1727204169.13743: Calling groups_inventory to load vars for managed-node3 19665 1727204169.13748: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.13762: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.13768: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.13773: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.15172: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000270 19665 1727204169.15176: WORKER PROCESS EXITING 19665 1727204169.16243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.18553: done with get_vars() 19665 1727204169.18588: done getting variables 19665 1727204169.18657: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.18797: variable 'profile' from source: play vars 19665 1727204169.18802: variable 'interface' from source: set_fact 19665 1727204169.18875: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.087) 0:00:20.055 ***** 19665 1727204169.18913: entering _queue_task() for managed-node3/command 19665 1727204169.19261: worker is 1 (out of 1 available) 19665 1727204169.19276: exiting _queue_task() for managed-node3/command 19665 1727204169.19289: done queuing things up, now waiting for results queue to drain 19665 1727204169.19291: waiting for pending results... 
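The task just queued (get_profile_stat.yml:49) is skipped in the run that follows because profile_stat.stat.exists evaluates to false, i.e. no ifcfg file for LSR-TST-br31 is present. A sketch of the shape such a guarded command task likely has; the grep pattern, the ifcfg path, and the register name are assumptions for illustration, not quoted from the log:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists

The three tasks that follow in the log (get_profile_stat.yml:56, :62 and :69) carry the same when: profile_stat.stat.exists guard and are skipped for the same reason.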
19665 1727204169.20119: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 19665 1727204169.20401: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000272 19665 1727204169.20432: variable 'ansible_search_path' from source: unknown 19665 1727204169.20446: variable 'ansible_search_path' from source: unknown 19665 1727204169.20484: calling self._execute() 19665 1727204169.20593: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.20604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.20618: variable 'omit' from source: magic vars 19665 1727204169.20967: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.20976: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.21069: variable 'profile_stat' from source: set_fact 19665 1727204169.21081: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204169.21084: when evaluation is False, skipping this task 19665 1727204169.21087: _execute() done 19665 1727204169.21089: dumping result to json 19665 1727204169.21091: done dumping result, returning 19665 1727204169.21097: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000272] 19665 1727204169.21104: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000272 19665 1727204169.21203: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000272 19665 1727204169.21206: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204169.21254: no more pending results, returning what we have 19665 1727204169.21258: results queue empty 19665 1727204169.21259: checking for any_errors_fatal 19665 1727204169.21268: done checking for any_errors_fatal 19665 1727204169.21269: checking for max_fail_percentage 19665 1727204169.21271: done checking for max_fail_percentage 19665 1727204169.21272: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.21273: done checking to see if all hosts have failed 19665 1727204169.21273: getting the remaining hosts for this loop 19665 1727204169.21276: done getting the remaining hosts for this loop 19665 1727204169.21281: getting the next task for host managed-node3 19665 1727204169.21289: done getting next task for host managed-node3 19665 1727204169.21291: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 19665 1727204169.21295: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.21298: getting variables 19665 1727204169.21300: in VariableManager get_vars() 19665 1727204169.21329: Calling all_inventory to load vars for managed-node3 19665 1727204169.21331: Calling groups_inventory to load vars for managed-node3 19665 1727204169.21335: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.21347: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.21351: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.21353: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.22201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.24628: done with get_vars() 19665 1727204169.24666: done getting variables 19665 1727204169.24737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.24861: variable 'profile' from source: play vars 19665 1727204169.24867: variable 'interface' from source: set_fact 19665 1727204169.24937: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.060) 0:00:20.116 ***** 19665 1727204169.24974: entering _queue_task() for managed-node3/set_fact 19665 1727204169.25330: worker is 1 (out of 1 available) 19665 1727204169.25341: exiting _queue_task() for managed-node3/set_fact 19665 1727204169.25358: done queuing things up, now waiting for results queue to drain 19665 1727204169.25360: waiting for pending results... 
19665 1727204169.25639: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 19665 1727204169.25776: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000273 19665 1727204169.25806: variable 'ansible_search_path' from source: unknown 19665 1727204169.25813: variable 'ansible_search_path' from source: unknown 19665 1727204169.25854: calling self._execute() 19665 1727204169.25959: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.25972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.25986: variable 'omit' from source: magic vars 19665 1727204169.26389: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.26408: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.26557: variable 'profile_stat' from source: set_fact 19665 1727204169.26582: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204169.26590: when evaluation is False, skipping this task 19665 1727204169.26597: _execute() done 19665 1727204169.26603: dumping result to json 19665 1727204169.26609: done dumping result, returning 19665 1727204169.26619: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000273] 19665 1727204169.26628: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000273 19665 1727204169.26747: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000273 19665 1727204169.26754: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204169.26813: no more pending results, returning what we have 19665 1727204169.26818: results queue empty 19665 1727204169.26819: checking for any_errors_fatal 19665 1727204169.26828: done checking for any_errors_fatal 19665 1727204169.26829: checking for max_fail_percentage 19665 1727204169.26830: done checking for max_fail_percentage 19665 1727204169.26835: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.26836: done checking to see if all hosts have failed 19665 1727204169.26837: getting the remaining hosts for this loop 19665 1727204169.26839: done getting the remaining hosts for this loop 19665 1727204169.26843: getting the next task for host managed-node3 19665 1727204169.26854: done getting next task for host managed-node3 19665 1727204169.26858: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 19665 1727204169.26862: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.26869: getting variables 19665 1727204169.26871: in VariableManager get_vars() 19665 1727204169.26902: Calling all_inventory to load vars for managed-node3 19665 1727204169.26904: Calling groups_inventory to load vars for managed-node3 19665 1727204169.26908: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.26923: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.26926: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.26929: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.28947: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.31611: done with get_vars() 19665 1727204169.31651: done getting variables 19665 1727204169.31718: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.31840: variable 'profile' from source: play vars 19665 1727204169.31845: variable 'interface' from source: set_fact 19665 1727204169.31907: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.069) 0:00:20.186 ***** 19665 1727204169.31943: entering _queue_task() for managed-node3/command 19665 1727204169.32293: worker is 1 (out of 1 available) 19665 1727204169.32306: exiting _queue_task() for managed-node3/command 19665 1727204169.32319: done queuing things up, now waiting for results queue to drain 19665 1727204169.32320: waiting for pending results... 
19665 1727204169.32613: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 19665 1727204169.32768: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000274 19665 1727204169.32795: variable 'ansible_search_path' from source: unknown 19665 1727204169.32802: variable 'ansible_search_path' from source: unknown 19665 1727204169.32842: calling self._execute() 19665 1727204169.32946: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.32959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.32982: variable 'omit' from source: magic vars 19665 1727204169.33392: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.33410: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.33552: variable 'profile_stat' from source: set_fact 19665 1727204169.33574: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204169.33584: when evaluation is False, skipping this task 19665 1727204169.33592: _execute() done 19665 1727204169.33600: dumping result to json 19665 1727204169.33608: done dumping result, returning 19665 1727204169.33621: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000274] 19665 1727204169.33637: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000274 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204169.33792: no more pending results, returning what we have 19665 1727204169.33796: results queue empty 19665 1727204169.33797: checking for any_errors_fatal 19665 1727204169.33804: done checking for any_errors_fatal 19665 1727204169.33805: checking for max_fail_percentage 19665 1727204169.33806: done checking for max_fail_percentage 19665 1727204169.33807: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.33808: done checking to see if all hosts have failed 19665 1727204169.33809: getting the remaining hosts for this loop 19665 1727204169.33811: done getting the remaining hosts for this loop 19665 1727204169.33816: getting the next task for host managed-node3 19665 1727204169.33825: done getting next task for host managed-node3 19665 1727204169.33827: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 19665 1727204169.33831: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.33835: getting variables 19665 1727204169.33837: in VariableManager get_vars() 19665 1727204169.33869: Calling all_inventory to load vars for managed-node3 19665 1727204169.33872: Calling groups_inventory to load vars for managed-node3 19665 1727204169.33876: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.33893: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.33897: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.33903: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.34950: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000274 19665 1727204169.34954: WORKER PROCESS EXITING 19665 1727204169.36781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.38842: done with get_vars() 19665 1727204169.38979: done getting variables 19665 1727204169.39045: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.39352: variable 'profile' from source: play vars 19665 1727204169.39356: variable 'interface' from source: set_fact 19665 1727204169.39428: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.075) 0:00:20.261 ***** 19665 1727204169.39462: entering _queue_task() for managed-node3/set_fact 19665 1727204169.40091: worker is 1 (out of 1 available) 19665 1727204169.40104: exiting _queue_task() for managed-node3/set_fact 19665 1727204169.40115: done queuing things up, now waiting for results queue to drain 19665 1727204169.40117: waiting for pending results... 
19665 1727204169.40788: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 19665 1727204169.41155: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000275 19665 1727204169.41258: variable 'ansible_search_path' from source: unknown 19665 1727204169.41271: variable 'ansible_search_path' from source: unknown 19665 1727204169.41314: calling self._execute() 19665 1727204169.41567: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.41584: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.41603: variable 'omit' from source: magic vars 19665 1727204169.42494: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.42513: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.42771: variable 'profile_stat' from source: set_fact 19665 1727204169.42796: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204169.42805: when evaluation is False, skipping this task 19665 1727204169.42813: _execute() done 19665 1727204169.42821: dumping result to json 19665 1727204169.42829: done dumping result, returning 19665 1727204169.42842: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000275] 19665 1727204169.42853: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000275 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204169.43015: no more pending results, returning what we have 19665 1727204169.43020: results queue empty 19665 1727204169.43021: checking for any_errors_fatal 19665 1727204169.43029: done checking for any_errors_fatal 19665 1727204169.43030: checking for max_fail_percentage 19665 1727204169.43032: done checking for max_fail_percentage 19665 1727204169.43033: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.43034: done checking to see if all hosts have failed 19665 1727204169.43035: getting the remaining hosts for this loop 19665 1727204169.43037: done getting the remaining hosts for this loop 19665 1727204169.43045: getting the next task for host managed-node3 19665 1727204169.43054: done getting next task for host managed-node3 19665 1727204169.43057: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 19665 1727204169.43061: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.43068: getting variables 19665 1727204169.43070: in VariableManager get_vars() 19665 1727204169.43104: Calling all_inventory to load vars for managed-node3 19665 1727204169.43107: Calling groups_inventory to load vars for managed-node3 19665 1727204169.43111: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.43126: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.43130: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.43134: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.44201: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000275 19665 1727204169.44205: WORKER PROCESS EXITING 19665 1727204169.45234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.47112: done with get_vars() 19665 1727204169.47149: done getting variables 19665 1727204169.47219: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.47357: variable 'profile' from source: play vars 19665 1727204169.47361: variable 'interface' from source: set_fact 19665 1727204169.47435: variable 'interface' from source: set_fact TASK [Assert that the profile is present - 'LSR-TST-br31'] ********************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.080) 0:00:20.341 ***** 19665 1727204169.47473: entering _queue_task() for managed-node3/assert 19665 1727204169.47851: worker is 1 (out of 1 available) 19665 1727204169.47868: exiting _queue_task() for managed-node3/assert 19665 1727204169.47880: done queuing things up, now waiting for results queue to drain 19665 1727204169.47882: waiting for pending results... 
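The assert just queued (assert_profile_present.yml:5) checks the lsr_net_profile_exists flag set earlier in this run; the two asserts that follow it (assert_profile_present.yml:10 and :15) check lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint in the same way. A minimal sketch of the first one, with an illustrative failure message that is not taken from the collection:

- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    msg: Profile {{ profile }} is not present   # illustrative message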
19665 1727204169.48176: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'LSR-TST-br31' 19665 1727204169.48293: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000260 19665 1727204169.48314: variable 'ansible_search_path' from source: unknown 19665 1727204169.48326: variable 'ansible_search_path' from source: unknown 19665 1727204169.48370: calling self._execute() 19665 1727204169.48481: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.48497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.48511: variable 'omit' from source: magic vars 19665 1727204169.48922: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.48945: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.48958: variable 'omit' from source: magic vars 19665 1727204169.49008: variable 'omit' from source: magic vars 19665 1727204169.49125: variable 'profile' from source: play vars 19665 1727204169.49145: variable 'interface' from source: set_fact 19665 1727204169.49218: variable 'interface' from source: set_fact 19665 1727204169.49248: variable 'omit' from source: magic vars 19665 1727204169.49304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204169.49348: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204169.49384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204169.49412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.49427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.49470: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204169.49482: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.49490: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.49605: Set connection var ansible_connection to ssh 19665 1727204169.49619: Set connection var ansible_shell_type to sh 19665 1727204169.49632: Set connection var ansible_timeout to 10 19665 1727204169.49644: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204169.49655: Set connection var ansible_shell_executable to /bin/sh 19665 1727204169.49667: Set connection var ansible_pipelining to False 19665 1727204169.49700: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.49707: variable 'ansible_connection' from source: unknown 19665 1727204169.49716: variable 'ansible_module_compression' from source: unknown 19665 1727204169.49722: variable 'ansible_shell_type' from source: unknown 19665 1727204169.49728: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.49737: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.49747: variable 'ansible_pipelining' from source: unknown 19665 1727204169.49752: variable 'ansible_timeout' from source: unknown 19665 1727204169.49759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.49925: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204169.49945: variable 'omit' from source: magic vars 19665 1727204169.49960: starting attempt loop 19665 1727204169.49971: running the handler 19665 1727204169.50106: variable 'lsr_net_profile_exists' from source: set_fact 19665 1727204169.50123: Evaluated conditional (lsr_net_profile_exists): True 19665 1727204169.50137: handler run complete 19665 1727204169.50160: attempt loop complete, returning result 19665 1727204169.50173: _execute() done 19665 1727204169.50180: dumping result to json 19665 1727204169.50187: done dumping result, returning 19665 1727204169.50198: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-000000000260] 19665 1727204169.50208: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000260 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204169.50373: no more pending results, returning what we have 19665 1727204169.50377: results queue empty 19665 1727204169.50378: checking for any_errors_fatal 19665 1727204169.50388: done checking for any_errors_fatal 19665 1727204169.50389: checking for max_fail_percentage 19665 1727204169.50390: done checking for max_fail_percentage 19665 1727204169.50391: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.50392: done checking to see if all hosts have failed 19665 1727204169.50393: getting the remaining hosts for this loop 19665 1727204169.50395: done getting the remaining hosts for this loop 19665 1727204169.50400: getting the next task for host managed-node3 19665 1727204169.50407: done getting next task for host managed-node3 19665 1727204169.50410: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 19665 1727204169.50414: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.50419: getting variables 19665 1727204169.50421: in VariableManager get_vars() 19665 1727204169.50457: Calling all_inventory to load vars for managed-node3 19665 1727204169.50460: Calling groups_inventory to load vars for managed-node3 19665 1727204169.50466: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.50479: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.50483: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.50486: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.51518: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000260 19665 1727204169.51522: WORKER PROCESS EXITING 19665 1727204169.53965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.61066: done with get_vars() 19665 1727204169.61218: done getting variables 19665 1727204169.61283: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.61529: variable 'profile' from source: play vars 19665 1727204169.61648: variable 'interface' from source: set_fact 19665 1727204169.61713: variable 'interface' from source: set_fact TASK [Assert that the ansible managed comment is present in 'LSR-TST-br31'] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.142) 0:00:20.484 ***** 19665 1727204169.61869: entering _queue_task() for managed-node3/assert 19665 1727204169.62581: worker is 1 (out of 1 available) 19665 1727204169.62592: exiting _queue_task() for managed-node3/assert 19665 1727204169.62605: done queuing things up, now waiting for results queue to drain 19665 1727204169.62607: waiting for pending results... 
19665 1727204169.63650: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' 19665 1727204169.63854: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000261 19665 1727204169.63868: variable 'ansible_search_path' from source: unknown 19665 1727204169.64094: variable 'ansible_search_path' from source: unknown 19665 1727204169.64132: calling self._execute() 19665 1727204169.64345: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.64349: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.64360: variable 'omit' from source: magic vars 19665 1727204169.64847: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.64867: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.64873: variable 'omit' from source: magic vars 19665 1727204169.64911: variable 'omit' from source: magic vars 19665 1727204169.65021: variable 'profile' from source: play vars 19665 1727204169.65024: variable 'interface' from source: set_fact 19665 1727204169.65201: variable 'interface' from source: set_fact 19665 1727204169.65221: variable 'omit' from source: magic vars 19665 1727204169.65261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204169.65385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204169.65486: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204169.65502: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.65518: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.65553: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204169.65556: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.65559: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.65835: Set connection var ansible_connection to ssh 19665 1727204169.65844: Set connection var ansible_shell_type to sh 19665 1727204169.65851: Set connection var ansible_timeout to 10 19665 1727204169.65857: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204169.65866: Set connection var ansible_shell_executable to /bin/sh 19665 1727204169.65874: Set connection var ansible_pipelining to False 19665 1727204169.65900: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.65903: variable 'ansible_connection' from source: unknown 19665 1727204169.65906: variable 'ansible_module_compression' from source: unknown 19665 1727204169.65908: variable 'ansible_shell_type' from source: unknown 19665 1727204169.65910: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.65913: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.65915: variable 'ansible_pipelining' from source: unknown 19665 1727204169.65917: variable 'ansible_timeout' from source: unknown 19665 1727204169.65924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.66376: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204169.66397: variable 'omit' from source: magic vars 19665 1727204169.66409: starting attempt loop 19665 1727204169.66417: running the handler 19665 1727204169.66591: variable 'lsr_net_profile_ansible_managed' from source: set_fact 19665 1727204169.66607: Evaluated conditional (lsr_net_profile_ansible_managed): True 19665 1727204169.66617: handler run complete 19665 1727204169.66637: attempt loop complete, returning result 19665 1727204169.66648: _execute() done 19665 1727204169.66654: dumping result to json 19665 1727204169.66661: done dumping result, returning 19665 1727204169.66698: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-000000000261] 19665 1727204169.66730: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000261 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204169.66884: no more pending results, returning what we have 19665 1727204169.66889: results queue empty 19665 1727204169.66890: checking for any_errors_fatal 19665 1727204169.66897: done checking for any_errors_fatal 19665 1727204169.66898: checking for max_fail_percentage 19665 1727204169.66900: done checking for max_fail_percentage 19665 1727204169.66901: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.66902: done checking to see if all hosts have failed 19665 1727204169.66903: getting the remaining hosts for this loop 19665 1727204169.66905: done getting the remaining hosts for this loop 19665 1727204169.66910: getting the next task for host managed-node3 19665 1727204169.66918: done getting next task for host managed-node3 19665 1727204169.66920: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 19665 1727204169.66924: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.66928: getting variables 19665 1727204169.66930: in VariableManager get_vars() 19665 1727204169.66968: Calling all_inventory to load vars for managed-node3 19665 1727204169.66971: Calling groups_inventory to load vars for managed-node3 19665 1727204169.66975: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.66989: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.66993: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.66996: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.67845: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000261 19665 1727204169.67849: WORKER PROCESS EXITING 19665 1727204169.68268: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.69777: done with get_vars() 19665 1727204169.69806: done getting variables 19665 1727204169.69971: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204169.70103: variable 'profile' from source: play vars 19665 1727204169.70107: variable 'interface' from source: set_fact 19665 1727204169.70297: variable 'interface' from source: set_fact TASK [Assert that the fingerprint comment is present in LSR-TST-br31] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:56:09 -0400 (0:00:00.085) 0:00:20.569 ***** 19665 1727204169.70343: entering _queue_task() for managed-node3/assert 19665 1727204169.71031: worker is 1 (out of 1 available) 19665 1727204169.71043: exiting _queue_task() for managed-node3/assert 19665 1727204169.71056: done queuing things up, now waiting for results queue to drain 19665 1727204169.71058: waiting for pending results... 
19665 1727204169.71932: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 19665 1727204169.72068: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000262 19665 1727204169.72102: variable 'ansible_search_path' from source: unknown 19665 1727204169.72110: variable 'ansible_search_path' from source: unknown 19665 1727204169.72154: calling self._execute() 19665 1727204169.72269: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.72281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.72307: variable 'omit' from source: magic vars 19665 1727204169.72836: variable 'ansible_distribution_major_version' from source: facts 19665 1727204169.72869: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204169.72880: variable 'omit' from source: magic vars 19665 1727204169.72919: variable 'omit' from source: magic vars 19665 1727204169.73034: variable 'profile' from source: play vars 19665 1727204169.73044: variable 'interface' from source: set_fact 19665 1727204169.73122: variable 'interface' from source: set_fact 19665 1727204169.73178: variable 'omit' from source: magic vars 19665 1727204169.73232: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204169.73274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204169.73315: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204169.73339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.73356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204169.73399: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204169.73418: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.73426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.74644: Set connection var ansible_connection to ssh 19665 1727204169.74737: Set connection var ansible_shell_type to sh 19665 1727204169.74748: Set connection var ansible_timeout to 10 19665 1727204169.74757: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204169.74777: Set connection var ansible_shell_executable to /bin/sh 19665 1727204169.74789: Set connection var ansible_pipelining to False 19665 1727204169.74818: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.74902: variable 'ansible_connection' from source: unknown 19665 1727204169.74918: variable 'ansible_module_compression' from source: unknown 19665 1727204169.74926: variable 'ansible_shell_type' from source: unknown 19665 1727204169.74935: variable 'ansible_shell_executable' from source: unknown 19665 1727204169.74951: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204169.74978: variable 'ansible_pipelining' from source: unknown 19665 1727204169.75001: variable 'ansible_timeout' from source: unknown 19665 1727204169.75035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204169.75431: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204169.75447: variable 'omit' from source: magic vars 19665 1727204169.75456: starting attempt loop 19665 1727204169.75498: running the handler 19665 1727204169.75736: variable 'lsr_net_profile_fingerprint' from source: set_fact 19665 1727204169.75747: Evaluated conditional (lsr_net_profile_fingerprint): True 19665 1727204169.75774: handler run complete 19665 1727204169.75792: attempt loop complete, returning result 19665 1727204169.75824: _execute() done 19665 1727204169.75833: dumping result to json 19665 1727204169.75934: done dumping result, returning 19665 1727204169.75946: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000262] 19665 1727204169.75955: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000262 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204169.76096: no more pending results, returning what we have 19665 1727204169.76101: results queue empty 19665 1727204169.76102: checking for any_errors_fatal 19665 1727204169.76109: done checking for any_errors_fatal 19665 1727204169.76110: checking for max_fail_percentage 19665 1727204169.76112: done checking for max_fail_percentage 19665 1727204169.76113: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.76114: done checking to see if all hosts have failed 19665 1727204169.76115: getting the remaining hosts for this loop 19665 1727204169.76117: done getting the remaining hosts for this loop 19665 1727204169.76121: getting the next task for host managed-node3 19665 1727204169.76130: done getting next task for host managed-node3 19665 1727204169.76133: ^ task is: TASK: meta (flush_handlers) 19665 1727204169.76135: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.76139: getting variables 19665 1727204169.76141: in VariableManager get_vars() 19665 1727204169.76178: Calling all_inventory to load vars for managed-node3 19665 1727204169.76180: Calling groups_inventory to load vars for managed-node3 19665 1727204169.76184: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.76196: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.76200: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.76203: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.77233: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000262 19665 1727204169.77236: WORKER PROCESS EXITING 19665 1727204169.79354: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.85889: done with get_vars() 19665 1727204169.86054: done getting variables 19665 1727204169.86281: in VariableManager get_vars() 19665 1727204169.86293: Calling all_inventory to load vars for managed-node3 19665 1727204169.86296: Calling groups_inventory to load vars for managed-node3 19665 1727204169.86298: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.86304: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.86306: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.86308: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.89887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.93184: done with get_vars() 19665 1727204169.93227: done queuing things up, now waiting for results queue to drain 19665 1727204169.93229: results queue empty 19665 1727204169.93230: checking for any_errors_fatal 19665 1727204169.93233: done checking for any_errors_fatal 19665 1727204169.93234: checking for max_fail_percentage 19665 1727204169.93235: done checking for max_fail_percentage 19665 1727204169.93242: checking to see if all hosts have failed and the running result is not ok 19665 1727204169.93243: done checking to see if all hosts have failed 19665 1727204169.93244: getting the remaining hosts for this loop 19665 1727204169.93245: done getting the remaining hosts for this loop 19665 1727204169.93248: getting the next task for host managed-node3 19665 1727204169.93252: done getting next task for host managed-node3 19665 1727204169.93253: ^ task is: TASK: meta (flush_handlers) 19665 1727204169.93255: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204169.93258: getting variables 19665 1727204169.93259: in VariableManager get_vars() 19665 1727204169.93272: Calling all_inventory to load vars for managed-node3 19665 1727204169.93275: Calling groups_inventory to load vars for managed-node3 19665 1727204169.93277: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.93283: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.93285: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.93288: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.95393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204169.97913: done with get_vars() 19665 1727204169.98018: done getting variables 19665 1727204169.98154: in VariableManager get_vars() 19665 1727204169.98216: Calling all_inventory to load vars for managed-node3 19665 1727204169.98219: Calling groups_inventory to load vars for managed-node3 19665 1727204169.98222: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204169.98228: Calling all_plugins_play to load vars for managed-node3 19665 1727204169.98231: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204169.98234: Calling groups_plugins_play to load vars for managed-node3 19665 1727204169.99757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204170.01760: done with get_vars() 19665 1727204170.01796: done queuing things up, now waiting for results queue to drain 19665 1727204170.01798: results queue empty 19665 1727204170.01799: checking for any_errors_fatal 19665 1727204170.01800: done checking for any_errors_fatal 19665 1727204170.01801: checking for max_fail_percentage 19665 1727204170.01802: done checking for max_fail_percentage 19665 1727204170.01803: checking to see if all hosts have failed and the running result is not ok 19665 1727204170.01804: done checking to see if all hosts have failed 19665 1727204170.01805: getting the remaining hosts for this loop 19665 1727204170.01806: done getting the remaining hosts for this loop 19665 1727204170.01808: getting the next task for host managed-node3 19665 1727204170.01812: done getting next task for host managed-node3 19665 1727204170.01812: ^ task is: None 19665 1727204170.01814: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204170.01815: done queuing things up, now waiting for results queue to drain 19665 1727204170.01816: results queue empty 19665 1727204170.01817: checking for any_errors_fatal 19665 1727204170.01817: done checking for any_errors_fatal 19665 1727204170.01818: checking for max_fail_percentage 19665 1727204170.01819: done checking for max_fail_percentage 19665 1727204170.01820: checking to see if all hosts have failed and the running result is not ok 19665 1727204170.01820: done checking to see if all hosts have failed 19665 1727204170.01821: getting the next task for host managed-node3 19665 1727204170.01824: done getting next task for host managed-node3 19665 1727204170.01824: ^ task is: None 19665 1727204170.01825: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204170.01946: in VariableManager get_vars() 19665 1727204170.01971: done with get_vars() 19665 1727204170.01978: in VariableManager get_vars() 19665 1727204170.01990: done with get_vars() 19665 1727204170.01995: variable 'omit' from source: magic vars 19665 1727204170.02128: variable 'profile' from source: play vars 19665 1727204170.02326: in VariableManager get_vars() 19665 1727204170.02351: done with get_vars() 19665 1727204170.02384: variable 'omit' from source: magic vars 19665 1727204170.02452: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 19665 1727204170.03459: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204170.03486: getting the remaining hosts for this loop 19665 1727204170.03488: done getting the remaining hosts for this loop 19665 1727204170.03490: getting the next task for host managed-node3 19665 1727204170.03493: done getting next task for host managed-node3 19665 1727204170.03495: ^ task is: TASK: Gathering Facts 19665 1727204170.03496: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204170.03498: getting variables 19665 1727204170.03499: in VariableManager get_vars() 19665 1727204170.03582: Calling all_inventory to load vars for managed-node3 19665 1727204170.03585: Calling groups_inventory to load vars for managed-node3 19665 1727204170.03588: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204170.03594: Calling all_plugins_play to load vars for managed-node3 19665 1727204170.03597: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204170.03600: Calling groups_plugins_play to load vars for managed-node3 19665 1727204170.05061: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204170.08998: done with get_vars() 19665 1727204170.09031: done getting variables 19665 1727204170.09105: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:56:10 -0400 (0:00:00.387) 0:00:20.957 ***** 19665 1727204170.09136: entering _queue_task() for managed-node3/gather_facts 19665 1727204170.09467: worker is 1 (out of 1 available) 19665 1727204170.09478: exiting _queue_task() for managed-node3/gather_facts 19665 1727204170.09490: done queuing things up, now waiting for results queue to drain 19665 1727204170.09492: waiting for pending results... 
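What follows is the worker actually executing the queued Gathering Facts task: TaskExecutor opens the ssh connection, finds the remote home directory with 'echo ~', creates a per-task temp directory under /root/.ansible/tmp, transfers the AnsiballZ_setup.py payload over sftp, marks it executable, runs it with the discovered /usr/bin/python3.9, reads the facts JSON from stdout, and removes the temp directory. As a hedged illustration only, the short Python sketch below re-creates that command sequence with plain ssh/scp calls; the host alias, interpreter path, and payload file name are placeholders lifted from this log, and the sketch is not Ansible's actual connection-plugin code.

# Hypothetical, simplified re-creation of the command sequence traced in the
# log entries below. "managed-node3" and the interpreter path are placeholders
# taken from this log; this is not Ansible's own implementation.
import json
import subprocess
import time

HOST = "managed-node3"               # placeholder for the inventory host
PYTHON = "/usr/bin/python3.9"        # remote interpreter, per the log
LOCAL_MODULE = "AnsiballZ_setup.py"  # locally built module payload (assumed to exist)

def ssh(command: str) -> str:
    """Run one shell command on the managed node and return its stdout."""
    result = subprocess.run(
        ["ssh", HOST, command], capture_output=True, text=True, check=True
    )
    return result.stdout.strip()

def run_setup_task() -> dict:
    home = ssh("echo ~ && sleep 0")                        # 1. find remote home
    tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f'umask 77 && mkdir -p "{tmp}"')                   # 2. per-task temp dir
    subprocess.run(                                        # 3. transfer module (sftp in the log)
        ["scp", LOCAL_MODULE, f"{HOST}:{tmp}/AnsiballZ_setup.py"], check=True
    )
    ssh(f'chmod u+x "{tmp}" "{tmp}/AnsiballZ_setup.py"')   # 4. make it executable
    facts_json = ssh(f'{PYTHON} "{tmp}/AnsiballZ_setup.py"')  # 5. run the module
    ssh(f'rm -rf "{tmp}" > /dev/null 2>&1')                # 6. clean up the temp dir
    return json.loads(facts_json)

if __name__ == "__main__":
    facts = run_setup_task()
    print(facts["ansible_facts"]["ansible_distribution"])

With ansible_pipelining left at False (as the connection vars in the next entries show), Ansible performs this temp-file round trip for every module invocation; enabling pipelining streams the module over a single SSH session and avoids the separate sftp/chmod/rm steps.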
19665 1727204170.09957: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204170.10282: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002b5 19665 1727204170.10387: variable 'ansible_search_path' from source: unknown 19665 1727204170.10424: calling self._execute() 19665 1727204170.10523: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204170.10533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204170.10549: variable 'omit' from source: magic vars 19665 1727204170.10920: variable 'ansible_distribution_major_version' from source: facts 19665 1727204170.11100: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204170.11112: variable 'omit' from source: magic vars 19665 1727204170.11143: variable 'omit' from source: magic vars 19665 1727204170.11189: variable 'omit' from source: magic vars 19665 1727204170.11234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204170.11274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204170.11303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204170.11326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204170.11342: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204170.11379: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204170.11389: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204170.11396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204170.11496: Set connection var ansible_connection to ssh 19665 1727204170.11510: Set connection var ansible_shell_type to sh 19665 1727204170.11522: Set connection var ansible_timeout to 10 19665 1727204170.11532: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204170.11544: Set connection var ansible_shell_executable to /bin/sh 19665 1727204170.11554: Set connection var ansible_pipelining to False 19665 1727204170.11579: variable 'ansible_shell_executable' from source: unknown 19665 1727204170.11639: variable 'ansible_connection' from source: unknown 19665 1727204170.11647: variable 'ansible_module_compression' from source: unknown 19665 1727204170.11653: variable 'ansible_shell_type' from source: unknown 19665 1727204170.11659: variable 'ansible_shell_executable' from source: unknown 19665 1727204170.11665: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204170.11672: variable 'ansible_pipelining' from source: unknown 19665 1727204170.11677: variable 'ansible_timeout' from source: unknown 19665 1727204170.11682: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204170.11862: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204170.11882: variable 'omit' from source: magic vars 19665 1727204170.11891: starting attempt loop 19665 1727204170.11898: running the 
handler 19665 1727204170.11918: variable 'ansible_facts' from source: unknown 19665 1727204170.11943: _low_level_execute_command(): starting 19665 1727204170.11955: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204170.13313: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204170.13520: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.13545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.13568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.13611: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.13625: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204170.13639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.13658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204170.13673: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204170.13684: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204170.13696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.13711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.13726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.13739: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.13751: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204170.13767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.13841: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204170.13867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.13884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.13965: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.15580: stdout chunk (state=3): >>>/root <<< 19665 1727204170.15770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204170.15773: stdout chunk (state=3): >>><<< 19665 1727204170.15789: stderr chunk (state=3): >>><<< 19665 1727204170.15908: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204170.15912: _low_level_execute_command(): starting 19665 1727204170.15915: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251 `" && echo ansible-tmp-1727204170.1581116-21370-140583397460251="` echo /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251 `" ) && sleep 0' 19665 1727204170.16830: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204170.16846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.16861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.16883: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.16931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.16944: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204170.16960: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.16986: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204170.16999: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204170.17011: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204170.17024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.17044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.17061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.17076: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.17089: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204170.17104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.17188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204170.17212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.17232: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.17305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.19134: stdout chunk (state=3): >>>ansible-tmp-1727204170.1581116-21370-140583397460251=/root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251 <<< 19665 1727204170.19249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204170.19347: stderr chunk (state=3): >>><<< 19665 1727204170.19359: stdout chunk (state=3): >>><<< 19665 1727204170.19575: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204170.1581116-21370-140583397460251=/root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204170.19578: variable 'ansible_module_compression' from source: unknown 19665 1727204170.19581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204170.19583: variable 'ansible_facts' from source: unknown 19665 1727204170.19748: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/AnsiballZ_setup.py 19665 1727204170.20087: Sending initial data 19665 1727204170.20096: Sent initial data (154 bytes) 19665 1727204170.20895: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204170.20908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.20922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.20938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.20980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.20991: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204170.21004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.21020: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204170.21033: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204170.21044: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204170.21055: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.21070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.21085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.21097: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.21106: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204170.21118: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.21195: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204170.21212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.21227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.21305: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.23056: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204170.23094: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204170.23137: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5ypdrq9t /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/AnsiballZ_setup.py <<< 19665 1727204170.23181: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204170.25707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204170.25770: stderr chunk (state=3): >>><<< 19665 1727204170.25773: stdout chunk (state=3): >>><<< 19665 1727204170.25775: done transferring module to remote 19665 1727204170.25780: _low_level_execute_command(): starting 19665 1727204170.25783: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/ /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/AnsiballZ_setup.py && sleep 0' 19665 1727204170.27429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204170.27450: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.27468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.27488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.27531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.27579: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204170.27594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.27612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204170.27626: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204170.27640: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204170.27653: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.27670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.27793: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.27806: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.27817: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204170.27831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.27910: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204170.27933: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.27954: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.28030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.29822: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204170.29825: stdout chunk (state=3): >>><<< 19665 1727204170.29828: stderr chunk (state=3): >>><<< 19665 1727204170.29923: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204170.29926: _low_level_execute_command(): starting 19665 1727204170.29929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/AnsiballZ_setup.py && sleep 0' 19665 1727204170.30504: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204170.30517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.30532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.30552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.30596: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.30607: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204170.30620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.30637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204170.30651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204170.30661: 
stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204170.30679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.30691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204170.30705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.30716: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204170.30725: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204170.30737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.30818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204170.30834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.30851: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.30936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.82066: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkv<<< 19665 1727204170.82081: stdout chunk (state=3): 
>>>MFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.28, "5m": 0.32, "15m": 0.16}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", 
"weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "10", "epoch": "1727204170", "epoch_int": "1727204170", "date": "2024-09-24", "time": "14:56:10", "iso8601_micro": "2024-09-24T18:56:10.549575Z", "iso8601": "2024-09-24T18:56:10Z", "iso8601_basic": "20240924T145610549575", "iso8601_basic_short": "20240924T145610", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3262, "used": 270}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 516, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", 
"tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": 
"on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204170.83773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204170.83776: stdout chunk (state=3): >>><<< 19665 1727204170.83794: stderr chunk (state=3): >>><<< 19665 1727204170.83974: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.28, "5m": 0.32, "15m": 0.16}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_pkg_mgr": "dnf", "ansible_is_chroot": false, "ansible_fibre_channel_wwn": [], "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_apparmor": {"status": "disabled"}, "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "10", "epoch": "1727204170", "epoch_int": "1727204170", "date": "2024-09-24", "time": "14:56:10", "iso8601_micro": "2024-09-24T18:56:10.549575Z", "iso8601": "2024-09-24T18:56:10Z", "iso8601_basic": "20240924T145610549575", "iso8601_basic_short": "20240924T145610", "tz": "EDT", "tz_dst": "EDT", "tz_offset": 
"-0400"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3262, "used": 270}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 516, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "LSR-TST-br31", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", 
"tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_LSR_TST_br31": {"device": "LSR-TST-br31", "macaddress": "ae:40:77:00:3f:d3", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": 
["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204170.84260: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204170.84292: _low_level_execute_command(): starting 19665 1727204170.84304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204170.1581116-21370-140583397460251/ > /dev/null 2>&1 && sleep 0' 19665 1727204170.85343: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.85347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204170.85396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204170.85399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204170.85402: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204170.85404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204170.85469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204170.86408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204170.86441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204170.88252: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204170.88336: stderr chunk (state=3): >>><<< 19665 1727204170.88339: stdout chunk (state=3): >>><<< 19665 1727204170.88474: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204170.88478: handler run complete 19665 1727204170.88583: variable 'ansible_facts' from source: unknown 19665 1727204170.88645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204170.88997: variable 'ansible_facts' from source: unknown 19665 1727204170.89276: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204170.89570: attempt loop complete, returning result 19665 1727204170.89581: _execute() done 19665 1727204170.89590: dumping result to json 19665 1727204170.89630: done dumping result, returning 19665 1727204170.89684: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-0000000002b5] 19665 1727204170.89696: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002b5 ok: [managed-node3] 19665 1727204170.90662: no more pending results, returning what we have 19665 1727204170.90667: results queue empty 19665 1727204170.90668: checking for any_errors_fatal 19665 1727204170.90670: done checking for any_errors_fatal 19665 1727204170.90670: checking for max_fail_percentage 19665 1727204170.90673: done checking for max_fail_percentage 19665 1727204170.90674: checking to see if all hosts have failed and the running result is not ok 19665 1727204170.90675: done checking to see if all hosts have failed 19665 1727204170.90675: getting the remaining hosts for this loop 19665 1727204170.90677: done 
getting the remaining hosts for this loop 19665 1727204170.90682: getting the next task for host managed-node3 19665 1727204170.90688: done getting next task for host managed-node3 19665 1727204170.90690: ^ task is: TASK: meta (flush_handlers) 19665 1727204170.90692: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204170.90696: getting variables 19665 1727204170.90697: in VariableManager get_vars() 19665 1727204170.90730: Calling all_inventory to load vars for managed-node3 19665 1727204170.90733: Calling groups_inventory to load vars for managed-node3 19665 1727204170.90735: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204170.90747: Calling all_plugins_play to load vars for managed-node3 19665 1727204170.90750: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204170.90753: Calling groups_plugins_play to load vars for managed-node3 19665 1727204170.91534: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002b5 19665 1727204170.91538: WORKER PROCESS EXITING 19665 1727204170.93991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204170.97498: done with get_vars() 19665 1727204170.97644: done getting variables 19665 1727204170.97716: in VariableManager get_vars() 19665 1727204170.97845: Calling all_inventory to load vars for managed-node3 19665 1727204170.97848: Calling groups_inventory to load vars for managed-node3 19665 1727204170.97850: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204170.97856: Calling all_plugins_play to load vars for managed-node3 19665 1727204170.97858: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204170.97861: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.00422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.04508: done with get_vars() 19665 1727204171.04552: done queuing things up, now waiting for results queue to drain 19665 1727204171.04555: results queue empty 19665 1727204171.04556: checking for any_errors_fatal 19665 1727204171.04560: done checking for any_errors_fatal 19665 1727204171.04561: checking for max_fail_percentage 19665 1727204171.04562: done checking for max_fail_percentage 19665 1727204171.04791: checking to see if all hosts have failed and the running result is not ok 19665 1727204171.04793: done checking to see if all hosts have failed 19665 1727204171.04794: getting the remaining hosts for this loop 19665 1727204171.04795: done getting the remaining hosts for this loop 19665 1727204171.04799: getting the next task for host managed-node3 19665 1727204171.04803: done getting next task for host managed-node3 19665 1727204171.04806: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204171.04807: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204171.04819: getting variables 19665 1727204171.04820: in VariableManager get_vars() 19665 1727204171.04840: Calling all_inventory to load vars for managed-node3 19665 1727204171.04843: Calling groups_inventory to load vars for managed-node3 19665 1727204171.04844: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.04850: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.04852: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.04855: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.19441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.22537: done with get_vars() 19665 1727204171.22573: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:11 -0400 (0:00:01.136) 0:00:22.094 ***** 19665 1727204171.22774: entering _queue_task() for managed-node3/include_tasks 19665 1727204171.23867: worker is 1 (out of 1 available) 19665 1727204171.23880: exiting _queue_task() for managed-node3/include_tasks 19665 1727204171.23892: done queuing things up, now waiting for results queue to drain 19665 1727204171.23894: waiting for pending results... 19665 1727204171.24187: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204171.24314: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003a 19665 1727204171.24343: variable 'ansible_search_path' from source: unknown 19665 1727204171.24351: variable 'ansible_search_path' from source: unknown 19665 1727204171.24394: calling self._execute() 19665 1727204171.25214: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.25227: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.25242: variable 'omit' from source: magic vars 19665 1727204171.25968: variable 'ansible_distribution_major_version' from source: facts 19665 1727204171.25987: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204171.25996: _execute() done 19665 1727204171.26004: dumping result to json 19665 1727204171.26010: done dumping result, returning 19665 1727204171.26023: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0dcc-3ea6-00000000003a] 19665 1727204171.26032: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003a 19665 1727204171.26159: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003a 19665 1727204171.26172: WORKER PROCESS EXITING 19665 1727204171.26215: no more pending results, returning what we have 19665 1727204171.26220: in VariableManager get_vars() 19665 1727204171.26270: Calling all_inventory to load vars for managed-node3 19665 1727204171.26273: Calling groups_inventory to load vars for managed-node3 19665 1727204171.26277: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.26290: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.26294: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.26296: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.28120: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.30887: done with get_vars() 19665 1727204171.30912: variable 'ansible_search_path' from source: unknown 19665 1727204171.30914: variable 'ansible_search_path' from source: unknown 19665 1727204171.30944: we have included files to process 19665 1727204171.30946: generating all_blocks data 19665 1727204171.30947: done generating all_blocks data 19665 1727204171.30948: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204171.30949: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204171.30951: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204171.31531: done processing included file 19665 1727204171.31533: iterating over new_blocks loaded from include file 19665 1727204171.31534: in VariableManager get_vars() 19665 1727204171.31559: done with get_vars() 19665 1727204171.31561: filtering new block on tags 19665 1727204171.31580: done filtering new block on tags 19665 1727204171.31584: in VariableManager get_vars() 19665 1727204171.31604: done with get_vars() 19665 1727204171.31605: filtering new block on tags 19665 1727204171.31625: done filtering new block on tags 19665 1727204171.31628: in VariableManager get_vars() 19665 1727204171.31647: done with get_vars() 19665 1727204171.31649: filtering new block on tags 19665 1727204171.31672: done filtering new block on tags 19665 1727204171.31675: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 19665 1727204171.31680: extending task lists for all hosts with included blocks 19665 1727204171.33024: done extending task lists 19665 1727204171.33026: done processing included files 19665 1727204171.33027: results queue empty 19665 1727204171.33027: checking for any_errors_fatal 19665 1727204171.33030: done checking for any_errors_fatal 19665 1727204171.33030: checking for max_fail_percentage 19665 1727204171.33031: done checking for max_fail_percentage 19665 1727204171.33032: checking to see if all hosts have failed and the running result is not ok 19665 1727204171.33033: done checking to see if all hosts have failed 19665 1727204171.33034: getting the remaining hosts for this loop 19665 1727204171.33035: done getting the remaining hosts for this loop 19665 1727204171.33037: getting the next task for host managed-node3 19665 1727204171.33155: done getting next task for host managed-node3 19665 1727204171.33159: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204171.33162: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204171.33175: getting variables 19665 1727204171.33176: in VariableManager get_vars() 19665 1727204171.33191: Calling all_inventory to load vars for managed-node3 19665 1727204171.33194: Calling groups_inventory to load vars for managed-node3 19665 1727204171.33196: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.33201: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.33203: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.33206: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.36215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.38647: done with get_vars() 19665 1727204171.38680: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.159) 0:00:22.254 ***** 19665 1727204171.38769: entering _queue_task() for managed-node3/setup 19665 1727204171.39156: worker is 1 (out of 1 available) 19665 1727204171.39172: exiting _queue_task() for managed-node3/setup 19665 1727204171.39185: done queuing things up, now waiting for results queue to drain 19665 1727204171.39186: waiting for pending results... 19665 1727204171.39613: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204171.39798: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002f6 19665 1727204171.39821: variable 'ansible_search_path' from source: unknown 19665 1727204171.39828: variable 'ansible_search_path' from source: unknown 19665 1727204171.39875: calling self._execute() 19665 1727204171.39981: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.39997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.40013: variable 'omit' from source: magic vars 19665 1727204171.40416: variable 'ansible_distribution_major_version' from source: facts 19665 1727204171.40436: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204171.40670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204171.43512: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204171.44354: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204171.44407: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204171.44450: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204171.44488: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204171.44580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204171.44612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 19665 1727204171.44642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204171.44736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204171.44784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204171.44873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204171.44938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204171.44991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204171.45059: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204171.45086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204171.45380: variable '__network_required_facts' from source: role '' defaults 19665 1727204171.45395: variable 'ansible_facts' from source: unknown 19665 1727204171.46541: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19665 1727204171.46557: when evaluation is False, skipping this task 19665 1727204171.46575: _execute() done 19665 1727204171.46592: dumping result to json 19665 1727204171.46616: done dumping result, returning 19665 1727204171.46636: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0dcc-3ea6-0000000002f6] 19665 1727204171.46652: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f6 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204171.46845: no more pending results, returning what we have 19665 1727204171.46851: results queue empty 19665 1727204171.46852: checking for any_errors_fatal 19665 1727204171.46856: done checking for any_errors_fatal 19665 1727204171.46857: checking for max_fail_percentage 19665 1727204171.46859: done checking for max_fail_percentage 19665 1727204171.46859: checking to see if all hosts have failed and the running result is not ok 19665 1727204171.46860: done checking to see if all hosts have failed 19665 1727204171.46861: getting the remaining hosts for this loop 19665 1727204171.46865: done getting the remaining hosts for this loop 19665 1727204171.46869: getting the next task for host managed-node3 19665 1727204171.46879: done getting next task for host 
managed-node3 19665 1727204171.46884: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204171.46887: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204171.46903: getting variables 19665 1727204171.46905: in VariableManager get_vars() 19665 1727204171.46951: Calling all_inventory to load vars for managed-node3 19665 1727204171.46954: Calling groups_inventory to load vars for managed-node3 19665 1727204171.46959: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.46973: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.46977: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.46981: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.48019: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f6 19665 1727204171.48022: WORKER PROCESS EXITING 19665 1727204171.49149: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.53303: done with get_vars() 19665 1727204171.53448: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.149) 0:00:22.403 ***** 19665 1727204171.53674: entering _queue_task() for managed-node3/stat 19665 1727204171.54202: worker is 1 (out of 1 available) 19665 1727204171.54226: exiting _queue_task() for managed-node3/stat 19665 1727204171.54238: done queuing things up, now waiting for results queue to drain 19665 1727204171.54239: waiting for pending results... 
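
The task skipped just above ("Ensure ansible_facts used by role are present") guards a fact-gathering step with the conditional shown in the log, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, i.e. re-run setup only when a fact the role needs is missing from what was already collected. A minimal sketch of a task guarded that way follows; the module arguments are illustrative assumptions, not taken from the role's set_facts.yml.

    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset: min   # assumption: the role may request a different subset
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
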
19665 1727204171.55228: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204171.55517: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002f8 19665 1727204171.55598: variable 'ansible_search_path' from source: unknown 19665 1727204171.55658: variable 'ansible_search_path' from source: unknown 19665 1727204171.55745: calling self._execute() 19665 1727204171.56022: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.56085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.56116: variable 'omit' from source: magic vars 19665 1727204171.57044: variable 'ansible_distribution_major_version' from source: facts 19665 1727204171.57095: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204171.57505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204171.58040: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204171.58153: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204171.58188: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204171.58221: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204171.58485: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204171.58689: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204171.58790: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204171.58817: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204171.59327: variable '__network_is_ostree' from source: set_fact 19665 1727204171.59334: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204171.59337: when evaluation is False, skipping this task 19665 1727204171.59340: _execute() done 19665 1727204171.59380: dumping result to json 19665 1727204171.59383: done dumping result, returning 19665 1727204171.59386: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0dcc-3ea6-0000000002f8] 19665 1727204171.59395: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f8 19665 1727204171.59719: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f8 19665 1727204171.59721: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204171.59789: no more pending results, returning what we have 19665 1727204171.59793: results queue empty 19665 1727204171.59794: checking for any_errors_fatal 19665 1727204171.59799: done checking for any_errors_fatal 19665 1727204171.59800: checking for 
max_fail_percentage 19665 1727204171.59801: done checking for max_fail_percentage 19665 1727204171.59802: checking to see if all hosts have failed and the running result is not ok 19665 1727204171.59803: done checking to see if all hosts have failed 19665 1727204171.59804: getting the remaining hosts for this loop 19665 1727204171.59806: done getting the remaining hosts for this loop 19665 1727204171.59810: getting the next task for host managed-node3 19665 1727204171.59818: done getting next task for host managed-node3 19665 1727204171.59822: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204171.59825: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204171.59838: getting variables 19665 1727204171.59839: in VariableManager get_vars() 19665 1727204171.59886: Calling all_inventory to load vars for managed-node3 19665 1727204171.59888: Calling groups_inventory to load vars for managed-node3 19665 1727204171.59890: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.59902: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.59905: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.59909: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.61955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.64075: done with get_vars() 19665 1727204171.64121: done getting variables 19665 1727204171.64186: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.105) 0:00:22.508 ***** 19665 1727204171.64246: entering _queue_task() for managed-node3/set_fact 19665 1727204171.64813: worker is 1 (out of 1 available) 19665 1727204171.64828: exiting _queue_task() for managed-node3/set_fact 19665 1727204171.64839: done queuing things up, now waiting for results queue to drain 19665 1727204171.64841: waiting for pending results... 
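
The two tasks around this point, "Check if system is ostree" and "Set flag to indicate system is ostree", are both skipped because __network_is_ostree was already set earlier in the run, so the guard not __network_is_ostree is defined evaluates to False. A sketch of a stat/set_fact pair behind such a guard is below; the checked path and the register name are assumptions for illustration, since the log does not show the module arguments.

    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted   # assumption: common marker file on ostree-based systems
      register: __ostree_booted_stat
      when: not __network_is_ostree is defined

    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined
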
19665 1727204171.65854: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204171.66365: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002f9 19665 1727204171.66378: variable 'ansible_search_path' from source: unknown 19665 1727204171.66382: variable 'ansible_search_path' from source: unknown 19665 1727204171.66417: calling self._execute() 19665 1727204171.66951: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.66991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.67001: variable 'omit' from source: magic vars 19665 1727204171.67514: variable 'ansible_distribution_major_version' from source: facts 19665 1727204171.67525: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204171.67830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204171.68435: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204171.68531: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204171.68606: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204171.68643: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204171.68991: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204171.68994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204171.68997: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204171.69002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204171.69004: variable '__network_is_ostree' from source: set_fact 19665 1727204171.69034: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204171.69037: when evaluation is False, skipping this task 19665 1727204171.69040: _execute() done 19665 1727204171.69044: dumping result to json 19665 1727204171.69047: done dumping result, returning 19665 1727204171.69083: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0dcc-3ea6-0000000002f9] 19665 1727204171.69086: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f9 19665 1727204171.69219: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002f9 19665 1727204171.69222: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204171.69276: no more pending results, returning what we have 19665 1727204171.69280: results queue empty 19665 1727204171.69281: checking for any_errors_fatal 19665 1727204171.69288: done checking for any_errors_fatal 19665 
1727204171.69288: checking for max_fail_percentage 19665 1727204171.69290: done checking for max_fail_percentage 19665 1727204171.69291: checking to see if all hosts have failed and the running result is not ok 19665 1727204171.69292: done checking to see if all hosts have failed 19665 1727204171.69292: getting the remaining hosts for this loop 19665 1727204171.69294: done getting the remaining hosts for this loop 19665 1727204171.69298: getting the next task for host managed-node3 19665 1727204171.69307: done getting next task for host managed-node3 19665 1727204171.69311: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204171.69314: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204171.69327: getting variables 19665 1727204171.69332: in VariableManager get_vars() 19665 1727204171.69375: Calling all_inventory to load vars for managed-node3 19665 1727204171.69378: Calling groups_inventory to load vars for managed-node3 19665 1727204171.69380: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204171.69391: Calling all_plugins_play to load vars for managed-node3 19665 1727204171.69393: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204171.69396: Calling groups_plugins_play to load vars for managed-node3 19665 1727204171.72044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204171.74788: done with get_vars() 19665 1727204171.74819: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:11 -0400 (0:00:00.106) 0:00:22.615 ***** 19665 1727204171.74924: entering _queue_task() for managed-node3/service_facts 19665 1727204171.75310: worker is 1 (out of 1 available) 19665 1727204171.75324: exiting _queue_task() for managed-node3/service_facts 19665 1727204171.75336: done queuing things up, now waiting for results queue to drain 19665 1727204171.75340: waiting for pending results... 
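
The task queued here, "Check which services are running", executes the service_facts module; its output is the ansible_facts.services table (unit name, state, status, source) that appears further down in this log. A short sketch of the task, plus one hypothetical way to consume the result, is below; the follow-up set_fact and its variable name are illustrative only, not taken from the role.

    - name: Check which services are running
      ansible.builtin.service_facts:

    # Hypothetical follow-up: derive a flag from the gathered service table.
    - name: Record whether NetworkManager is running (illustrative)
      ansible.builtin.set_fact:
        __example_nm_running: "{{ ansible_facts.services.get('NetworkManager.service', {}).get('state', '') == 'running' }}"
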
19665 1727204171.75650: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204171.75820: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002fb 19665 1727204171.75849: variable 'ansible_search_path' from source: unknown 19665 1727204171.75858: variable 'ansible_search_path' from source: unknown 19665 1727204171.75903: calling self._execute() 19665 1727204171.76012: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.76024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.76058: variable 'omit' from source: magic vars 19665 1727204171.76446: variable 'ansible_distribution_major_version' from source: facts 19665 1727204171.76466: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204171.76478: variable 'omit' from source: magic vars 19665 1727204171.76544: variable 'omit' from source: magic vars 19665 1727204171.76582: variable 'omit' from source: magic vars 19665 1727204171.76631: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204171.76681: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204171.76713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204171.76735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204171.76751: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204171.76786: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204171.76794: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.76800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.76907: Set connection var ansible_connection to ssh 19665 1727204171.76925: Set connection var ansible_shell_type to sh 19665 1727204171.76936: Set connection var ansible_timeout to 10 19665 1727204171.76945: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204171.76956: Set connection var ansible_shell_executable to /bin/sh 19665 1727204171.76971: Set connection var ansible_pipelining to False 19665 1727204171.76995: variable 'ansible_shell_executable' from source: unknown 19665 1727204171.77002: variable 'ansible_connection' from source: unknown 19665 1727204171.77008: variable 'ansible_module_compression' from source: unknown 19665 1727204171.77013: variable 'ansible_shell_type' from source: unknown 19665 1727204171.77018: variable 'ansible_shell_executable' from source: unknown 19665 1727204171.77024: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204171.77034: variable 'ansible_pipelining' from source: unknown 19665 1727204171.77041: variable 'ansible_timeout' from source: unknown 19665 1727204171.77048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204171.77937: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204171.77954: variable 'omit' from source: magic vars 19665 
1727204171.78015: starting attempt loop 19665 1727204171.78024: running the handler 19665 1727204171.78041: _low_level_execute_command(): starting 19665 1727204171.78053: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204171.80049: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204171.80072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.80090: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.80108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.80151: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.80191: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204171.80208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.80228: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204171.80298: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204171.80312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204171.80325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.80338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.80353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.80366: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.80378: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204171.80397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.80476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204171.80542: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204171.80610: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204171.80696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204171.82332: stdout chunk (state=3): >>>/root <<< 19665 1727204171.82530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204171.82533: stdout chunk (state=3): >>><<< 19665 1727204171.82535: stderr chunk (state=3): >>><<< 19665 1727204171.82656: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204171.82661: _low_level_execute_command(): starting 19665 1727204171.82667: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783 `" && echo ansible-tmp-1727204171.82555-21573-218206554638783="` echo /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783 `" ) && sleep 0' 19665 1727204171.83972: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.83976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.84213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.84216: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.84219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.84283: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204171.84499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204171.84544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204171.86430: stdout chunk (state=3): >>>ansible-tmp-1727204171.82555-21573-218206554638783=/root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783 <<< 19665 1727204171.86538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204171.86628: stderr chunk (state=3): >>><<< 19665 1727204171.86632: stdout chunk (state=3): >>><<< 19665 1727204171.86873: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204171.82555-21573-218206554638783=/root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204171.86877: variable 'ansible_module_compression' from source: unknown 19665 1727204171.86880: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19665 1727204171.86882: variable 'ansible_facts' from source: unknown 19665 1727204171.86884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/AnsiballZ_service_facts.py 19665 1727204171.87497: Sending initial data 19665 1727204171.87500: Sent initial data (160 bytes) 19665 1727204171.90135: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204171.90161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.90180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.90198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.90240: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.90375: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204171.90390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.90408: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204171.90421: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204171.90432: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204171.90443: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.90456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.90478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.90490: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.90500: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204171.90513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.90595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204171.90672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204171.90689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204171.90773: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204171.92567: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204171.92611: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204171.92655: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpgfx8yt35 /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/AnsiballZ_service_facts.py <<< 19665 1727204171.92696: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204171.94070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204171.94156: stderr chunk (state=3): >>><<< 19665 1727204171.94159: stdout chunk (state=3): >>><<< 19665 1727204171.94182: done transferring module to remote 19665 1727204171.94193: _low_level_execute_command(): starting 19665 1727204171.94198: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/ /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/AnsiballZ_service_facts.py && sleep 0' 19665 1727204171.96008: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204171.96023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.96033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.96052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.96092: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.96134: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204171.96181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.96194: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204171.96202: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204171.96209: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204171.96217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204171.96226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204171.96247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204171.96252: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204171.96260: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204171.96283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204171.96346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204171.96482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204171.96503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204171.96682: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204171.98435: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204171.98439: stdout chunk (state=3): >>><<< 19665 1727204171.98450: stderr chunk (state=3): >>><<< 19665 1727204171.98480: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204171.98483: _low_level_execute_command(): starting 19665 1727204171.98488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/AnsiballZ_service_facts.py && sleep 0' 19665 1727204172.00395: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204172.00399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204172.00447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204172.00455: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204172.00462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204172.00478: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204172.00484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204172.00552: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204172.00708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204172.00769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.30023: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", 
"status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-qu<<< 19665 1727204173.30037: stdout chunk (state=3): >>>it-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": 
"systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 19665 1727204173.30061: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, 
"grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19665 1727204173.31380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204173.31387: stderr chunk (state=3): >>><<< 19665 1727204173.31392: stdout chunk (state=3): >>><<< 19665 1727204173.31420: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, 
"hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": 
"rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": 
{"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204173.32174: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204173.32184: _low_level_execute_command(): starting 19665 1727204173.32189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204171.82555-21573-218206554638783/ > /dev/null 2>&1 && sleep 0' 19665 1727204173.34044: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.34048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.34060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.34100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204173.34104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 19665 1727204173.34122: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.34127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204173.34232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.34302: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.34315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.34321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.34397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.36273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204173.36301: stdout chunk (state=3): >>><<< 19665 1727204173.36305: stderr chunk (state=3): >>><<< 19665 1727204173.36676: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204173.36679: handler run complete 19665 1727204173.36682: variable 'ansible_facts' from source: unknown 19665 1727204173.36705: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204173.37126: variable 'ansible_facts' from source: unknown 19665 1727204173.37247: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204173.37422: attempt loop complete, returning result 19665 1727204173.37425: _execute() done 19665 1727204173.37429: dumping result to json 19665 1727204173.37479: done dumping result, returning 19665 1727204173.37492: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [0affcd87-79f5-0dcc-3ea6-0000000002fb] 19665 1727204173.37495: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002fb ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204173.38227: no more pending results, returning what we have 19665 1727204173.38230: results queue empty 19665 1727204173.38231: checking for any_errors_fatal 19665 1727204173.38240: done checking for any_errors_fatal 19665 1727204173.38241: checking for max_fail_percentage 19665 1727204173.38243: done checking for max_fail_percentage 19665 1727204173.38243: checking to see if all hosts have failed and the running result is not ok 19665 1727204173.38244: done checking to see if all hosts have failed 19665 1727204173.38245: getting the remaining hosts for this loop 19665 1727204173.38248: done getting the remaining hosts for this loop 19665 1727204173.38252: getting the next task for host managed-node3 19665 1727204173.38257: done getting next task for host managed-node3 19665 1727204173.38261: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204173.38263: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204173.38276: getting variables 19665 1727204173.38277: in VariableManager get_vars() 19665 1727204173.38312: Calling all_inventory to load vars for managed-node3 19665 1727204173.38315: Calling groups_inventory to load vars for managed-node3 19665 1727204173.38317: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204173.38326: Calling all_plugins_play to load vars for managed-node3 19665 1727204173.38328: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204173.38331: Calling groups_plugins_play to load vars for managed-node3 19665 1727204173.38991: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002fb 19665 1727204173.38995: WORKER PROCESS EXITING 19665 1727204173.40162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204173.42152: done with get_vars() 19665 1727204173.42180: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:13 -0400 (0:00:01.673) 0:00:24.289 ***** 19665 1727204173.42255: entering _queue_task() for managed-node3/package_facts 19665 1727204173.42497: worker is 1 (out of 1 available) 19665 1727204173.42512: exiting _queue_task() for managed-node3/package_facts 19665 1727204173.42525: done queuing things up, now waiting for results queue to drain 19665 1727204173.42527: waiting for pending results... 19665 1727204173.42706: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204173.42800: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000002fc 19665 1727204173.42819: variable 'ansible_search_path' from source: unknown 19665 1727204173.42824: variable 'ansible_search_path' from source: unknown 19665 1727204173.42864: calling self._execute() 19665 1727204173.42961: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204173.42977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204173.42991: variable 'omit' from source: magic vars 19665 1727204173.43351: variable 'ansible_distribution_major_version' from source: facts 19665 1727204173.43373: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204173.43385: variable 'omit' from source: magic vars 19665 1727204173.43444: variable 'omit' from source: magic vars 19665 1727204173.43486: variable 'omit' from source: magic vars 19665 1727204173.43668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204173.43807: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204173.43833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204173.43857: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204173.43876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204173.43909: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204173.43917: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204173.43925: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204173.44026: Set connection var ansible_connection to ssh 19665 1727204173.44040: Set connection var ansible_shell_type to sh 19665 1727204173.44053: Set connection var ansible_timeout to 10 19665 1727204173.44063: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204173.44080: Set connection var ansible_shell_executable to /bin/sh 19665 1727204173.44091: Set connection var ansible_pipelining to False 19665 1727204173.44118: variable 'ansible_shell_executable' from source: unknown 19665 1727204173.44126: variable 'ansible_connection' from source: unknown 19665 1727204173.44132: variable 'ansible_module_compression' from source: unknown 19665 1727204173.44138: variable 'ansible_shell_type' from source: unknown 19665 1727204173.44145: variable 'ansible_shell_executable' from source: unknown 19665 1727204173.44152: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204173.44159: variable 'ansible_pipelining' from source: unknown 19665 1727204173.44168: variable 'ansible_timeout' from source: unknown 19665 1727204173.44176: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204173.44370: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204173.44980: variable 'omit' from source: magic vars 19665 1727204173.44990: starting attempt loop 19665 1727204173.44997: running the handler 19665 1727204173.45014: _low_level_execute_command(): starting 19665 1727204173.45027: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204173.46120: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204173.46128: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.46142: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.46154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.46193: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.46199: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204173.46210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.46223: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204173.46230: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204173.46236: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204173.46245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.46254: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.46267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.46275: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.46282: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204173.46291: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.46361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.46381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.46391: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.46475: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.48095: stdout chunk (state=3): >>>/root <<< 19665 1727204173.48205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204173.48369: stderr chunk (state=3): >>><<< 19665 1727204173.48389: stdout chunk (state=3): >>><<< 19665 1727204173.48484: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204173.48489: _low_level_execute_command(): starting 19665 1727204173.48493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464 `" && echo ansible-tmp-1727204173.4843538-21666-97655831952464="` echo /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464 `" ) && sleep 0' 19665 1727204173.49481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204173.49505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.49523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.49558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.49621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.49644: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204173.49678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.49709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204173.49726: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204173.49749: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204173.49777: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.49810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.49833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.49853: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.49880: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204173.49915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.50033: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.50062: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.50086: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.50169: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.52092: stdout chunk (state=3): >>>ansible-tmp-1727204173.4843538-21666-97655831952464=/root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464 <<< 19665 1727204173.52211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204173.52315: stderr chunk (state=3): >>><<< 19665 1727204173.52332: stdout chunk (state=3): >>><<< 19665 1727204173.52476: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204173.4843538-21666-97655831952464=/root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204173.52480: variable 'ansible_module_compression' from source: unknown 19665 1727204173.52585: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19665 1727204173.52588: variable 'ansible_facts' from source: unknown 19665 1727204173.52777: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/AnsiballZ_package_facts.py 19665 1727204173.52975: Sending initial data 19665 1727204173.52978: Sent initial data (161 bytes) 19665 1727204173.54062: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204173.54082: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 
1727204173.54098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.54128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.54325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.54384: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204173.54419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.54509: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204173.54522: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204173.54570: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204173.54607: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.54647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.54663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.54682: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.54702: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204173.54734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.54926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.54950: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.54969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.55088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.56859: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204173.56923: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204173.56943: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpstp_2rol /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/AnsiballZ_package_facts.py <<< 19665 1727204173.56975: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204173.59763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204173.60090: stderr chunk (state=3): >>><<< 19665 1727204173.60096: stdout chunk (state=3): >>><<< 19665 1727204173.60099: done transferring module to remote 19665 1727204173.60108: _low_level_execute_command(): starting 19665 1727204173.60111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/ /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/AnsiballZ_package_facts.py && sleep 0' 19665 1727204173.61373: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204173.61437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.61454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.61693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.61738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.61868: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204173.61888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.61908: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204173.61919: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204173.61931: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204173.61944: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.61961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.61983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.62000: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.62013: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204173.62027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.62136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.62178: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.62199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.62276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204173.64152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204173.64156: stdout chunk (state=3): >>><<< 19665 1727204173.64159: stderr chunk (state=3): >>><<< 19665 1727204173.64268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204173.64271: _low_level_execute_command(): starting 19665 1727204173.64274: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/AnsiballZ_package_facts.py && sleep 0' 19665 1727204173.65603: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.65607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.65623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.65630: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204173.65640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.65656: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204173.65662: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204173.65672: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204173.65680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204173.65688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204173.65700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204173.65706: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204173.65712: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204173.65724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204173.65802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204173.65818: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204173.65832: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204173.65924: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204174.12884: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", 
"release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": 
"2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", 
"release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": 
[{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": 
"chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": 
"146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", 
"version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", 
"release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": 
[{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", 
"release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", 
"source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19665 1727204174.14296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204174.14395: stderr chunk (state=3): >>>Shared connection to 10.31.15.87 closed. <<< 19665 1727204174.14460: stderr chunk (state=3): >>><<< 19665 1727204174.14466: stdout chunk (state=3): >>><<< 19665 1727204174.14673: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": 
[{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": 
"2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", 
"release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": 
"squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": 
"cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": 
"1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": 
[{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": 
[{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", 
"epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", 
"source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": 
"0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": 
"python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204174.17817: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204174.17852: _low_level_execute_command(): starting 19665 1727204174.17866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204173.4843538-21666-97655831952464/ > /dev/null 2>&1 && sleep 0' 19665 1727204174.19329: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204174.19347: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204174.19361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204174.19384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204174.19428: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204174.19444: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204174.19460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204174.19480: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204174.19492: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204174.19503: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204174.19514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204174.19526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204174.19545: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204174.19556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 19665 1727204174.19570: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204174.19585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204174.19666: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204174.19690: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204174.19712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204174.19793: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204174.21694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204174.21698: stdout chunk (state=3): >>><<< 19665 1727204174.21700: stderr chunk (state=3): >>><<< 19665 1727204174.21813: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204174.21817: handler run complete 19665 1727204174.22751: variable 'ansible_facts' from source: unknown 19665 1727204174.23593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.28091: variable 'ansible_facts' from source: unknown 19665 1727204174.29590: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.30560: attempt loop complete, returning result 19665 1727204174.30589: _execute() done 19665 1727204174.30598: dumping result to json 19665 1727204174.30863: done dumping result, returning 19665 1727204174.30886: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0dcc-3ea6-0000000002fc] 19665 1727204174.30898: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002fc 19665 1727204174.34847: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000002fc 19665 1727204174.34850: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204174.35009: no more pending results, returning what we have 19665 1727204174.35012: results queue empty 19665 1727204174.35013: checking for any_errors_fatal 19665 1727204174.35018: done checking for any_errors_fatal 19665 1727204174.35019: checking for max_fail_percentage 19665 
1727204174.35020: done checking for max_fail_percentage 19665 1727204174.35021: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.35022: done checking to see if all hosts have failed 19665 1727204174.35023: getting the remaining hosts for this loop 19665 1727204174.35024: done getting the remaining hosts for this loop 19665 1727204174.35028: getting the next task for host managed-node3 19665 1727204174.35034: done getting next task for host managed-node3 19665 1727204174.35040: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204174.35042: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204174.35051: getting variables 19665 1727204174.35052: in VariableManager get_vars() 19665 1727204174.35104: Calling all_inventory to load vars for managed-node3 19665 1727204174.35108: Calling groups_inventory to load vars for managed-node3 19665 1727204174.35110: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.35120: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.35122: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.35125: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.37124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.40899: done with get_vars() 19665 1727204174.41916: done getting variables 19665 1727204174.41992: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.997) 0:00:25.286 ***** 19665 1727204174.42035: entering _queue_task() for managed-node3/debug 19665 1727204174.42427: worker is 1 (out of 1 available) 19665 1727204174.42444: exiting _queue_task() for managed-node3/debug 19665 1727204174.42457: done queuing things up, now waiting for results queue to drain 19665 1727204174.42459: waiting for pending results... 
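For reference, the package inventory dumped above was produced by the role's "Check which packages are installed" step, which the log shows running the package_facts module with module_args {"manager": ["auto"], "strategy": "first"} and with no_log enabled (hence the "censored" result). A minimal sketch of what such a task could look like in a role file follows; the module arguments and no_log flag are taken from the invocation shown in the log, while the exact layout of the role's own task file is not visible in this excerpt and is therefore illustrative only:

    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto
        strategy: first
      no_log: true   # matches the censoring seen in the result above

The gathered data is exposed to subsequent tasks as ansible_facts.packages, a dict keyed by package name whose values are lists of {name, version, release, epoch, arch, source} entries, exactly as seen in the module stdout above. A later task could, for example, gate on a package's presence with a conditional such as when: "'NetworkManager' in ansible_facts.packages" (illustrative usage, not taken from this run).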
19665 1727204174.43240: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204174.43371: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003b 19665 1727204174.43395: variable 'ansible_search_path' from source: unknown 19665 1727204174.43402: variable 'ansible_search_path' from source: unknown 19665 1727204174.43446: calling self._execute() 19665 1727204174.43562: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.43579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.43599: variable 'omit' from source: magic vars 19665 1727204174.44010: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.44038: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.44051: variable 'omit' from source: magic vars 19665 1727204174.44100: variable 'omit' from source: magic vars 19665 1727204174.44213: variable 'network_provider' from source: set_fact 19665 1727204174.44238: variable 'omit' from source: magic vars 19665 1727204174.44291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204174.44332: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204174.44668: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204174.44694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204174.44711: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204174.44747: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204174.44755: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.44762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.44870: Set connection var ansible_connection to ssh 19665 1727204174.44889: Set connection var ansible_shell_type to sh 19665 1727204174.44905: Set connection var ansible_timeout to 10 19665 1727204174.44916: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204174.44927: Set connection var ansible_shell_executable to /bin/sh 19665 1727204174.44940: Set connection var ansible_pipelining to False 19665 1727204174.46289: variable 'ansible_shell_executable' from source: unknown 19665 1727204174.46376: variable 'ansible_connection' from source: unknown 19665 1727204174.46384: variable 'ansible_module_compression' from source: unknown 19665 1727204174.46391: variable 'ansible_shell_type' from source: unknown 19665 1727204174.46396: variable 'ansible_shell_executable' from source: unknown 19665 1727204174.46402: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.46413: variable 'ansible_pipelining' from source: unknown 19665 1727204174.46420: variable 'ansible_timeout' from source: unknown 19665 1727204174.46428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.46572: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 19665 1727204174.47284: variable 'omit' from source: magic vars 19665 1727204174.47295: starting attempt loop 19665 1727204174.47303: running the handler 19665 1727204174.47352: handler run complete 19665 1727204174.47373: attempt loop complete, returning result 19665 1727204174.47379: _execute() done 19665 1727204174.47389: dumping result to json 19665 1727204174.47395: done dumping result, returning 19665 1727204174.47407: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0dcc-3ea6-00000000003b] 19665 1727204174.47415: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003b ok: [managed-node3] => {} MSG: Using network provider: nm 19665 1727204174.47582: no more pending results, returning what we have 19665 1727204174.47585: results queue empty 19665 1727204174.47586: checking for any_errors_fatal 19665 1727204174.47597: done checking for any_errors_fatal 19665 1727204174.47598: checking for max_fail_percentage 19665 1727204174.47600: done checking for max_fail_percentage 19665 1727204174.47600: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.47601: done checking to see if all hosts have failed 19665 1727204174.47602: getting the remaining hosts for this loop 19665 1727204174.47604: done getting the remaining hosts for this loop 19665 1727204174.47608: getting the next task for host managed-node3 19665 1727204174.47614: done getting next task for host managed-node3 19665 1727204174.47618: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204174.47620: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204174.47634: getting variables 19665 1727204174.47636: in VariableManager get_vars() 19665 1727204174.47674: Calling all_inventory to load vars for managed-node3 19665 1727204174.47676: Calling groups_inventory to load vars for managed-node3 19665 1727204174.47679: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.47685: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003b 19665 1727204174.47690: WORKER PROCESS EXITING 19665 1727204174.47701: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.47704: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.47707: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.49487: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.51668: done with get_vars() 19665 1727204174.51708: done getting variables 19665 1727204174.51772: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.097) 0:00:25.384 ***** 19665 1727204174.51810: entering _queue_task() for managed-node3/fail 19665 1727204174.52160: worker is 1 (out of 1 available) 19665 1727204174.52174: exiting _queue_task() for managed-node3/fail 19665 1727204174.52187: done queuing things up, now waiting for results queue to drain 19665 1727204174.52189: waiting for pending results... 
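The next task, "Abort applying the network state configuration if using the `network_state` variable with the initscripts provider" (main.yml:11), uses the 'fail' action and, per the evaluation that follows, is gated on network_state != {}. A plausible sketch; the provider check and the failure message are assumptions drawn only from the task name:

    # Hypothetical sketch; the fail action and the "network_state != {}" guard are confirmed
    # by the log, while the provider comparison and the message text are assumptions.
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      fail:
        msg: "network_state is not supported with the initscripts provider"
      when:
        - network_state != {}
        - network_provider == "initscripts"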
19665 1727204174.52491: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204174.52627: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003c 19665 1727204174.52651: variable 'ansible_search_path' from source: unknown 19665 1727204174.52667: variable 'ansible_search_path' from source: unknown 19665 1727204174.52711: calling self._execute() 19665 1727204174.52815: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.52828: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.52843: variable 'omit' from source: magic vars 19665 1727204174.53262: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.53282: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.53423: variable 'network_state' from source: role '' defaults 19665 1727204174.53441: Evaluated conditional (network_state != {}): False 19665 1727204174.53449: when evaluation is False, skipping this task 19665 1727204174.53455: _execute() done 19665 1727204174.53461: dumping result to json 19665 1727204174.53469: done dumping result, returning 19665 1727204174.53479: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0dcc-3ea6-00000000003c] 19665 1727204174.53489: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003c 19665 1727204174.53601: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003c 19665 1727204174.53610: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204174.53660: no more pending results, returning what we have 19665 1727204174.53667: results queue empty 19665 1727204174.53668: checking for any_errors_fatal 19665 1727204174.53679: done checking for any_errors_fatal 19665 1727204174.53680: checking for max_fail_percentage 19665 1727204174.53682: done checking for max_fail_percentage 19665 1727204174.53682: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.53683: done checking to see if all hosts have failed 19665 1727204174.53684: getting the remaining hosts for this loop 19665 1727204174.53686: done getting the remaining hosts for this loop 19665 1727204174.53690: getting the next task for host managed-node3 19665 1727204174.53697: done getting next task for host managed-node3 19665 1727204174.53701: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204174.53704: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204174.53720: getting variables 19665 1727204174.53722: in VariableManager get_vars() 19665 1727204174.53762: Calling all_inventory to load vars for managed-node3 19665 1727204174.53766: Calling groups_inventory to load vars for managed-node3 19665 1727204174.53769: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.53783: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.53787: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.53791: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.55699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.57438: done with get_vars() 19665 1727204174.57482: done getting variables 19665 1727204174.57563: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.057) 0:00:25.442 ***** 19665 1727204174.57597: entering _queue_task() for managed-node3/fail 19665 1727204174.57942: worker is 1 (out of 1 available) 19665 1727204174.57956: exiting _queue_task() for managed-node3/fail 19665 1727204174.57969: done queuing things up, now waiting for results queue to drain 19665 1727204174.57971: waiting for pending results... 
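Both abort tasks skip because network_state resolves from the role defaults ("variable 'network_state' from source: role '' defaults") and network_state != {} evaluates to False. That behavior is consistent with an empty-mapping default such as the one sketched below; this is an assumption about the role's defaults file, not its actual contents:

    # Hypothetical defaults entry implied by the log: with an empty mapping as the default,
    # every "network_state != {}" guard in this run evaluates to False.
    network_state: {}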
19665 1727204174.58822: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204174.59069: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003d 19665 1727204174.59158: variable 'ansible_search_path' from source: unknown 19665 1727204174.59169: variable 'ansible_search_path' from source: unknown 19665 1727204174.59214: calling self._execute() 19665 1727204174.59358: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.59373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.59387: variable 'omit' from source: magic vars 19665 1727204174.59779: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.59796: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.59926: variable 'network_state' from source: role '' defaults 19665 1727204174.59940: Evaluated conditional (network_state != {}): False 19665 1727204174.59952: when evaluation is False, skipping this task 19665 1727204174.59958: _execute() done 19665 1727204174.59967: dumping result to json 19665 1727204174.59975: done dumping result, returning 19665 1727204174.59985: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0dcc-3ea6-00000000003d] 19665 1727204174.59995: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003d skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204174.60145: no more pending results, returning what we have 19665 1727204174.60149: results queue empty 19665 1727204174.60150: checking for any_errors_fatal 19665 1727204174.60159: done checking for any_errors_fatal 19665 1727204174.60160: checking for max_fail_percentage 19665 1727204174.60162: done checking for max_fail_percentage 19665 1727204174.60162: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.60163: done checking to see if all hosts have failed 19665 1727204174.60166: getting the remaining hosts for this loop 19665 1727204174.60168: done getting the remaining hosts for this loop 19665 1727204174.60173: getting the next task for host managed-node3 19665 1727204174.60180: done getting next task for host managed-node3 19665 1727204174.60185: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204174.60188: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204174.60203: getting variables 19665 1727204174.60205: in VariableManager get_vars() 19665 1727204174.60244: Calling all_inventory to load vars for managed-node3 19665 1727204174.60247: Calling groups_inventory to load vars for managed-node3 19665 1727204174.60250: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.60262: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.60266: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.60270: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.61331: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003d 19665 1727204174.61334: WORKER PROCESS EXITING 19665 1727204174.62060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.63869: done with get_vars() 19665 1727204174.63896: done getting variables 19665 1727204174.63960: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.063) 0:00:25.506 ***** 19665 1727204174.63994: entering _queue_task() for managed-node3/fail 19665 1727204174.64335: worker is 1 (out of 1 available) 19665 1727204174.64348: exiting _queue_task() for managed-node3/fail 19665 1727204174.64361: done queuing things up, now waiting for results queue to drain 19665 1727204174.64362: waiting for pending results... 
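The teaming abort task (main.yml:25) is another 'fail' action; the evaluation below shows it gated on ansible_distribution_major_version | int > 9, which is False on this host, so it skips. A sketch, with the message text as an assumption; any additional team-related condition would not be visible here because the when evaluation stops at the first False:

    # Hypothetical sketch; the fail action and the version guard come from the log,
    # the message is an assumption, and further conditions may exist in the real task.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      fail:
        msg: "Team interfaces are not supported on EL10 or later"
      when: ansible_distribution_major_version | int > 9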
19665 1727204174.64649: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204174.64768: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003e 19665 1727204174.64789: variable 'ansible_search_path' from source: unknown 19665 1727204174.64796: variable 'ansible_search_path' from source: unknown 19665 1727204174.64844: calling self._execute() 19665 1727204174.64952: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.64963: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.64979: variable 'omit' from source: magic vars 19665 1727204174.65367: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.65387: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.65572: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204174.69012: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204174.69099: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204174.69150: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204174.69192: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204174.69226: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204174.69314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.69349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.69384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.69432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.69453: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.69566: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.69590: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19665 1727204174.69600: when evaluation is False, skipping this task 19665 1727204174.69607: _execute() done 19665 1727204174.69614: dumping result to json 19665 1727204174.69624: done dumping result, returning 19665 1727204174.69637: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0dcc-3ea6-00000000003e] 19665 1727204174.69648: sending task result for task 
0affcd87-79f5-0dcc-3ea6-00000000003e 19665 1727204174.69763: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003e 19665 1727204174.69774: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19665 1727204174.69833: no more pending results, returning what we have 19665 1727204174.69837: results queue empty 19665 1727204174.69838: checking for any_errors_fatal 19665 1727204174.69847: done checking for any_errors_fatal 19665 1727204174.69848: checking for max_fail_percentage 19665 1727204174.69850: done checking for max_fail_percentage 19665 1727204174.69850: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.69851: done checking to see if all hosts have failed 19665 1727204174.69852: getting the remaining hosts for this loop 19665 1727204174.69854: done getting the remaining hosts for this loop 19665 1727204174.69859: getting the next task for host managed-node3 19665 1727204174.69868: done getting next task for host managed-node3 19665 1727204174.69873: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204174.69875: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204174.69888: getting variables 19665 1727204174.69890: in VariableManager get_vars() 19665 1727204174.69930: Calling all_inventory to load vars for managed-node3 19665 1727204174.69933: Calling groups_inventory to load vars for managed-node3 19665 1727204174.69935: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.69947: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.69950: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.69953: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.73082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.76740: done with get_vars() 19665 1727204174.76778: done getting variables 19665 1727204174.76842: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.134) 0:00:25.641 ***** 19665 1727204174.77481: entering _queue_task() for managed-node3/dnf 19665 1727204174.77827: worker is 1 (out of 1 available) 19665 1727204174.77842: exiting _queue_task() for managed-node3/dnf 19665 1727204174.77854: done queuing things up, now waiting for results queue to drain 19665 1727204174.77856: waiting for pending results... 
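The DNF check task (main.yml:36) is queued with the 'dnf' action plugin. The entries that follow evaluate two guards: ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7 (True) and __network_wireless_connections_defined or __network_team_connections_defined (False), so the task skips. A sketch under those observations; the module arguments and check_mode usage are assumptions:

    # Hypothetical sketch; the dnf action and both when-conditions are confirmed by the log,
    # the package list and check_mode are assumptions about how the check is performed.
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined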
19665 1727204174.78773: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204174.79080: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000003f 19665 1727204174.79100: variable 'ansible_search_path' from source: unknown 19665 1727204174.79108: variable 'ansible_search_path' from source: unknown 19665 1727204174.79151: calling self._execute() 19665 1727204174.79369: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.79496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.79512: variable 'omit' from source: magic vars 19665 1727204174.80209: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.80374: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.80699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204174.83659: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204174.83727: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204174.83772: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204174.83807: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204174.83835: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204174.83917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.83952: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.83987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.84038: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.84099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.84348: variable 'ansible_distribution' from source: facts 19665 1727204174.84398: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.84489: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19665 1727204174.84727: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204174.85074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.85100: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.85133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.85195: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.85204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.85259: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.85282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.85302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.85339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.85359: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.85401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.85429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.85455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.85494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.85507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.85688: variable 'network_connections' from source: play vars 19665 1727204174.85699: variable 'profile' from source: play vars 19665 1727204174.85748: variable 'profile' from source: play vars 19665 1727204174.85751: variable 'interface' from source: set_fact 19665 1727204174.85806: variable 'interface' from source: set_fact 19665 1727204174.85859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 19665 1727204174.86268: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204174.86297: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204174.86320: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204174.86345: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204174.86380: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204174.86396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204174.86418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.86438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204174.86478: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204174.86633: variable 'network_connections' from source: play vars 19665 1727204174.86636: variable 'profile' from source: play vars 19665 1727204174.86687: variable 'profile' from source: play vars 19665 1727204174.86691: variable 'interface' from source: set_fact 19665 1727204174.86732: variable 'interface' from source: set_fact 19665 1727204174.86792: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204174.86795: when evaluation is False, skipping this task 19665 1727204174.86797: _execute() done 19665 1727204174.86804: dumping result to json 19665 1727204174.86806: done dumping result, returning 19665 1727204174.86808: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-00000000003f] 19665 1727204174.86810: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003f 19665 1727204174.86925: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000003f 19665 1727204174.86928: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204174.86985: no more pending results, returning what we have 19665 1727204174.86990: results queue empty 19665 1727204174.86991: checking for any_errors_fatal 19665 1727204174.86999: done checking for any_errors_fatal 19665 1727204174.87000: checking for max_fail_percentage 19665 1727204174.87002: done checking for max_fail_percentage 19665 1727204174.87003: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.87004: done checking to see if all hosts have failed 19665 1727204174.87004: getting the remaining hosts for this loop 19665 1727204174.87006: done getting the remaining hosts for this loop 19665 
1727204174.87011: getting the next task for host managed-node3 19665 1727204174.87017: done getting next task for host managed-node3 19665 1727204174.87021: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204174.87022: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204174.87035: getting variables 19665 1727204174.87036: in VariableManager get_vars() 19665 1727204174.87080: Calling all_inventory to load vars for managed-node3 19665 1727204174.87083: Calling groups_inventory to load vars for managed-node3 19665 1727204174.87085: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.87097: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.87099: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.87102: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.89920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.91112: done with get_vars() 19665 1727204174.91135: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204174.91196: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.137) 0:00:25.778 ***** 19665 1727204174.91221: entering _queue_task() for managed-node3/yum 19665 1727204174.91596: worker is 1 (out of 1 available) 19665 1727204174.91611: exiting _queue_task() for managed-node3/yum 19665 1727204174.91624: done queuing things up, now waiting for results queue to drain 19665 1727204174.91626: waiting for pending results... 
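The YUM variant (main.yml:48) mirrors the DNF check for older hosts; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above shows the yum action resolving to the dnf implementation on this controller. The guard evaluated below, ansible_distribution_major_version | int < 8, is False here, so it skips. A sketch with the same caveats as the DNF version:

    # Hypothetical sketch; only the yum action and the "| int < 8" guard are confirmed by the log.
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8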
19665 1727204174.91922: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204174.92058: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000040 19665 1727204174.92085: variable 'ansible_search_path' from source: unknown 19665 1727204174.92094: variable 'ansible_search_path' from source: unknown 19665 1727204174.92135: calling self._execute() 19665 1727204174.92247: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204174.92268: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204174.92285: variable 'omit' from source: magic vars 19665 1727204174.92692: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.92711: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204174.92893: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204174.95347: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204174.95426: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204174.95474: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204174.95514: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204174.95548: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204174.95632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204174.95668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204174.95699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204174.95744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204174.95767: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204174.95872: variable 'ansible_distribution_major_version' from source: facts 19665 1727204174.95892: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19665 1727204174.95900: when evaluation is False, skipping this task 19665 1727204174.95906: _execute() done 19665 1727204174.95913: dumping result to json 19665 1727204174.95919: done dumping result, returning 19665 1727204174.95932: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000040] 19665 
1727204174.95942: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000040 19665 1727204174.96053: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000040 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19665 1727204174.96109: no more pending results, returning what we have 19665 1727204174.96114: results queue empty 19665 1727204174.96115: checking for any_errors_fatal 19665 1727204174.96124: done checking for any_errors_fatal 19665 1727204174.96125: checking for max_fail_percentage 19665 1727204174.96127: done checking for max_fail_percentage 19665 1727204174.96128: checking to see if all hosts have failed and the running result is not ok 19665 1727204174.96129: done checking to see if all hosts have failed 19665 1727204174.96129: getting the remaining hosts for this loop 19665 1727204174.96137: done getting the remaining hosts for this loop 19665 1727204174.96141: getting the next task for host managed-node3 19665 1727204174.96148: done getting next task for host managed-node3 19665 1727204174.96154: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204174.96156: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204174.96173: getting variables 19665 1727204174.96175: in VariableManager get_vars() 19665 1727204174.96217: Calling all_inventory to load vars for managed-node3 19665 1727204174.96220: Calling groups_inventory to load vars for managed-node3 19665 1727204174.96223: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204174.96233: Calling all_plugins_play to load vars for managed-node3 19665 1727204174.96237: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204174.96240: Calling groups_plugins_play to load vars for managed-node3 19665 1727204174.96983: WORKER PROCESS EXITING 19665 1727204174.97747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204174.99486: done with get_vars() 19665 1727204174.99524: done getting variables 19665 1727204174.99608: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:14 -0400 (0:00:00.084) 0:00:25.863 ***** 19665 1727204174.99656: entering _queue_task() for managed-node3/fail 19665 1727204175.00052: worker is 1 (out of 1 available) 19665 1727204175.00068: exiting _queue_task() for managed-node3/fail 19665 1727204175.00084: done queuing things up, now waiting for results queue to drain 19665 1727204175.00086: waiting for pending results... 
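The consent task (main.yml:60) is a 'fail' action gated, per the evaluation that follows, on __network_wireless_connections_defined or __network_team_connections_defined; with neither wireless nor team profiles in network_connections, it skips. A sketch; the message text and any extra consent variable the real task may check are assumptions:

    # Hypothetical sketch; the fail action and the wireless/team guard come from the log.
    # The message, and any additional "user consent" flag the real task may test, are assumptions.
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      fail:
        msg: "Wireless or team interfaces require NetworkManager to restart; confirm before proceeding"
      when: __network_wireless_connections_defined or __network_team_connections_defined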
19665 1727204175.00437: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204175.00574: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000041 19665 1727204175.00597: variable 'ansible_search_path' from source: unknown 19665 1727204175.00611: variable 'ansible_search_path' from source: unknown 19665 1727204175.00657: calling self._execute() 19665 1727204175.00771: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.00783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.00796: variable 'omit' from source: magic vars 19665 1727204175.01215: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.01237: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.01369: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.01583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204175.03628: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204175.03717: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204175.03790: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204175.03837: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204175.03889: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204175.04022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.04067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.04105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.04159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.04191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.04255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.04296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.04337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.04387: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.04414: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.04480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.04519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.04560: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.04609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.04634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.04836: variable 'network_connections' from source: play vars 19665 1727204175.04852: variable 'profile' from source: play vars 19665 1727204175.04917: variable 'profile' from source: play vars 19665 1727204175.04920: variable 'interface' from source: set_fact 19665 1727204175.04975: variable 'interface' from source: set_fact 19665 1727204175.05048: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204175.05278: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204175.05325: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204175.05359: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204175.05389: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204175.05440: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204175.05478: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204175.05521: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.05565: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204175.05636: 
variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204175.05932: variable 'network_connections' from source: play vars 19665 1727204175.05945: variable 'profile' from source: play vars 19665 1727204175.06028: variable 'profile' from source: play vars 19665 1727204175.06039: variable 'interface' from source: set_fact 19665 1727204175.06106: variable 'interface' from source: set_fact 19665 1727204175.06135: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204175.06143: when evaluation is False, skipping this task 19665 1727204175.06150: _execute() done 19665 1727204175.06160: dumping result to json 19665 1727204175.06173: done dumping result, returning 19665 1727204175.06187: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000041] 19665 1727204175.06210: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000041 19665 1727204175.06309: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000041 19665 1727204175.06311: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204175.06400: no more pending results, returning what we have 19665 1727204175.06405: results queue empty 19665 1727204175.06406: checking for any_errors_fatal 19665 1727204175.06413: done checking for any_errors_fatal 19665 1727204175.06414: checking for max_fail_percentage 19665 1727204175.06416: done checking for max_fail_percentage 19665 1727204175.06418: checking to see if all hosts have failed and the running result is not ok 19665 1727204175.06418: done checking to see if all hosts have failed 19665 1727204175.06419: getting the remaining hosts for this loop 19665 1727204175.06421: done getting the remaining hosts for this loop 19665 1727204175.06426: getting the next task for host managed-node3 19665 1727204175.06433: done getting next task for host managed-node3 19665 1727204175.06440: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19665 1727204175.06443: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204175.06456: getting variables 19665 1727204175.06458: in VariableManager get_vars() 19665 1727204175.06507: Calling all_inventory to load vars for managed-node3 19665 1727204175.06510: Calling groups_inventory to load vars for managed-node3 19665 1727204175.06512: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204175.06524: Calling all_plugins_play to load vars for managed-node3 19665 1727204175.06528: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204175.06531: Calling groups_plugins_play to load vars for managed-node3 19665 1727204175.08018: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204175.09198: done with get_vars() 19665 1727204175.09226: done getting variables 19665 1727204175.09294: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.096) 0:00:25.959 ***** 19665 1727204175.09329: entering _queue_task() for managed-node3/package 19665 1727204175.09618: worker is 1 (out of 1 available) 19665 1727204175.09632: exiting _queue_task() for managed-node3/package 19665 1727204175.09647: done queuing things up, now waiting for results queue to drain 19665 1727204175.09649: waiting for pending results... 19665 1727204175.09896: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 19665 1727204175.10007: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000042 19665 1727204175.10028: variable 'ansible_search_path' from source: unknown 19665 1727204175.10043: variable 'ansible_search_path' from source: unknown 19665 1727204175.10095: calling self._execute() 19665 1727204175.10189: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.10204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.10208: variable 'omit' from source: magic vars 19665 1727204175.10630: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.10653: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.10856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204175.11122: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204175.11155: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204175.11190: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204175.11245: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204175.11403: variable 'network_packages' from source: role '' defaults 19665 1727204175.11561: variable '__network_provider_setup' from source: role '' defaults 19665 1727204175.11588: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204175.11670: variable 
'__network_service_name_default_nm' from source: role '' defaults 19665 1727204175.11684: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204175.11820: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204175.11971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204175.14119: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204175.14203: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204175.14246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204175.14287: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204175.14319: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204175.14407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.14440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.14474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.14518: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.14540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.14590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.14619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.14649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.14694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.14714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.14953: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204175.15066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.15089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.15124: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.15150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.15173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.15256: variable 'ansible_python' from source: facts 19665 1727204175.15292: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204175.15373: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204175.15439: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204175.15572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.15591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.15608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.15634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.15645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.15682: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.15702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.15724: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.15756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.15770: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.15865: variable 'network_connections' from source: play vars 19665 1727204175.15869: variable 'profile' from source: play vars 19665 1727204175.15971: variable 'profile' from source: play vars 19665 1727204175.15977: variable 'interface' from source: set_fact 19665 1727204175.16042: variable 'interface' from source: set_fact 19665 1727204175.16108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204175.16151: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204175.16181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.16216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204175.16285: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.16547: variable 'network_connections' from source: play vars 19665 1727204175.16550: variable 'profile' from source: play vars 19665 1727204175.16647: variable 'profile' from source: play vars 19665 1727204175.16663: variable 'interface' from source: set_fact 19665 1727204175.16744: variable 'interface' from source: set_fact 19665 1727204175.16785: variable '__network_packages_default_wireless' from source: role '' defaults 19665 1727204175.16934: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.17450: variable 'network_connections' from source: play vars 19665 1727204175.17477: variable 'profile' from source: play vars 19665 1727204175.17591: variable 'profile' from source: play vars 19665 1727204175.17601: variable 'interface' from source: set_fact 19665 1727204175.17757: variable 'interface' from source: set_fact 19665 1727204175.17815: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204175.17904: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204175.18358: variable 'network_connections' from source: play vars 19665 1727204175.18383: variable 'profile' from source: play vars 19665 1727204175.18495: variable 'profile' from source: play vars 19665 1727204175.18518: variable 'interface' from source: set_fact 19665 1727204175.18723: variable 'interface' from source: set_fact 19665 1727204175.18824: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204175.18956: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204175.18987: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204175.19059: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204175.19442: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204175.20006: variable 'network_connections' from source: play vars 19665 1727204175.20018: variable 'profile' from source: play vars 19665 
1727204175.20091: variable 'profile' from source: play vars 19665 1727204175.20100: variable 'interface' from source: set_fact 19665 1727204175.20162: variable 'interface' from source: set_fact 19665 1727204175.20180: variable 'ansible_distribution' from source: facts 19665 1727204175.20189: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.20198: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.20213: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204175.20376: variable 'ansible_distribution' from source: facts 19665 1727204175.20385: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.20394: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.20409: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204175.20670: variable 'ansible_distribution' from source: facts 19665 1727204175.20701: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.20710: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.20767: variable 'network_provider' from source: set_fact 19665 1727204175.20815: variable 'ansible_facts' from source: unknown 19665 1727204175.21867: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 19665 1727204175.21875: when evaluation is False, skipping this task 19665 1727204175.21881: _execute() done 19665 1727204175.21887: dumping result to json 19665 1727204175.21893: done dumping result, returning 19665 1727204175.21904: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [0affcd87-79f5-0dcc-3ea6-000000000042] 19665 1727204175.21913: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000042 skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19665 1727204175.22072: no more pending results, returning what we have 19665 1727204175.22076: results queue empty 19665 1727204175.22077: checking for any_errors_fatal 19665 1727204175.22086: done checking for any_errors_fatal 19665 1727204175.22087: checking for max_fail_percentage 19665 1727204175.22088: done checking for max_fail_percentage 19665 1727204175.22089: checking to see if all hosts have failed and the running result is not ok 19665 1727204175.22090: done checking to see if all hosts have failed 19665 1727204175.22090: getting the remaining hosts for this loop 19665 1727204175.22092: done getting the remaining hosts for this loop 19665 1727204175.22096: getting the next task for host managed-node3 19665 1727204175.22102: done getting next task for host managed-node3 19665 1727204175.22106: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204175.22108: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204175.22125: getting variables 19665 1727204175.22127: in VariableManager get_vars() 19665 1727204175.22167: Calling all_inventory to load vars for managed-node3 19665 1727204175.22169: Calling groups_inventory to load vars for managed-node3 19665 1727204175.22172: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204175.22183: Calling all_plugins_play to load vars for managed-node3 19665 1727204175.22190: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000042 19665 1727204175.22232: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204175.22245: Calling groups_plugins_play to load vars for managed-node3 19665 1727204175.22849: WORKER PROCESS EXITING 19665 1727204175.24496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204175.27171: done with get_vars() 19665 1727204175.27201: done getting variables 19665 1727204175.27269: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.179) 0:00:26.139 ***** 19665 1727204175.27305: entering _queue_task() for managed-node3/package 19665 1727204175.28153: worker is 1 (out of 1 available) 19665 1727204175.28170: exiting _queue_task() for managed-node3/package 19665 1727204175.28241: done queuing things up, now waiting for results queue to drain 19665 1727204175.28243: waiting for pending results... 
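For context, the task skipped just above is the role's "Install packages" task (roles/network/tasks/main.yml:73), handled by the 'package' action plugin and gated on the reported false_condition. A minimal sketch of what such a task could look like, reconstructed only from the action plugin, the network_packages role default, and the condition shown in the log; the actual task shipped in fedora.linux_system_roles.network may differ:

```yaml
# Hypothetical reconstruction, not copied from the role source.
- name: Install packages
  package:
    name: "{{ network_packages }}"   # role default resolved earlier in this log
    state: present
  when:
    # Matches the false_condition in the skip result above: only install when
    # some requested package is missing from the gathered package facts.
    - not network_packages is subset(ansible_facts.packages.keys())
```

Because every package in network_packages was already present in ansible_facts.packages, the conditional evaluated to False and the task was skipped without contacting the host.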
19665 1727204175.28673: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204175.28936: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000043 19665 1727204175.28963: variable 'ansible_search_path' from source: unknown 19665 1727204175.29014: variable 'ansible_search_path' from source: unknown 19665 1727204175.29082: calling self._execute() 19665 1727204175.29249: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.29263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.29281: variable 'omit' from source: magic vars 19665 1727204175.29843: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.29865: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.30012: variable 'network_state' from source: role '' defaults 19665 1727204175.30030: Evaluated conditional (network_state != {}): False 19665 1727204175.30042: when evaluation is False, skipping this task 19665 1727204175.30050: _execute() done 19665 1727204175.30057: dumping result to json 19665 1727204175.30066: done dumping result, returning 19665 1727204175.30079: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-000000000043] 19665 1727204175.30090: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000043 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204175.30257: no more pending results, returning what we have 19665 1727204175.30262: results queue empty 19665 1727204175.30263: checking for any_errors_fatal 19665 1727204175.30272: done checking for any_errors_fatal 19665 1727204175.30273: checking for max_fail_percentage 19665 1727204175.30275: done checking for max_fail_percentage 19665 1727204175.30276: checking to see if all hosts have failed and the running result is not ok 19665 1727204175.30277: done checking to see if all hosts have failed 19665 1727204175.30278: getting the remaining hosts for this loop 19665 1727204175.30280: done getting the remaining hosts for this loop 19665 1727204175.30285: getting the next task for host managed-node3 19665 1727204175.30292: done getting next task for host managed-node3 19665 1727204175.30297: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204175.30299: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204175.30316: getting variables 19665 1727204175.30318: in VariableManager get_vars() 19665 1727204175.30361: Calling all_inventory to load vars for managed-node3 19665 1727204175.30366: Calling groups_inventory to load vars for managed-node3 19665 1727204175.30369: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204175.30382: Calling all_plugins_play to load vars for managed-node3 19665 1727204175.30385: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204175.30388: Calling groups_plugins_play to load vars for managed-node3 19665 1727204175.40309: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000043 19665 1727204175.40314: WORKER PROCESS EXITING 19665 1727204175.41798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204175.45867: done with get_vars() 19665 1727204175.45898: done getting variables 19665 1727204175.45952: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.186) 0:00:26.326 ***** 19665 1727204175.45982: entering _queue_task() for managed-node3/package 19665 1727204175.46331: worker is 1 (out of 1 available) 19665 1727204175.46346: exiting _queue_task() for managed-node3/package 19665 1727204175.46359: done queuing things up, now waiting for results queue to drain 19665 1727204175.46360: waiting for pending results... 
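The task skipped above (roles/network/tasks/main.yml:85) and the one just queued (main.yml:96) are both gated on the same condition, network_state != {}, and both use the 'package' action. A hedged sketch of that pair of tasks, with the package lists inferred from the task names rather than from the role source:

```yaml
# Hypothetical sketch; exact package names and task options are assumptions.
- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}   # false here, so the task is skipped

- name: Install python3-libnmstate when using network_state variable
  package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
```

Since this run does not set the network_state role variable, both conditionals evaluate to False and both tasks are skipped.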
19665 1727204175.46658: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204175.46790: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000044 19665 1727204175.46819: variable 'ansible_search_path' from source: unknown 19665 1727204175.46828: variable 'ansible_search_path' from source: unknown 19665 1727204175.46876: calling self._execute() 19665 1727204175.46990: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.47002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.47016: variable 'omit' from source: magic vars 19665 1727204175.47424: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.47446: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.47587: variable 'network_state' from source: role '' defaults 19665 1727204175.47603: Evaluated conditional (network_state != {}): False 19665 1727204175.47611: when evaluation is False, skipping this task 19665 1727204175.47618: _execute() done 19665 1727204175.47626: dumping result to json 19665 1727204175.47633: done dumping result, returning 19665 1727204175.47647: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-000000000044] 19665 1727204175.47659: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000044 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204175.47836: no more pending results, returning what we have 19665 1727204175.47844: results queue empty 19665 1727204175.47845: checking for any_errors_fatal 19665 1727204175.47857: done checking for any_errors_fatal 19665 1727204175.47858: checking for max_fail_percentage 19665 1727204175.47860: done checking for max_fail_percentage 19665 1727204175.47861: checking to see if all hosts have failed and the running result is not ok 19665 1727204175.47862: done checking to see if all hosts have failed 19665 1727204175.47863: getting the remaining hosts for this loop 19665 1727204175.47866: done getting the remaining hosts for this loop 19665 1727204175.47871: getting the next task for host managed-node3 19665 1727204175.47878: done getting next task for host managed-node3 19665 1727204175.47883: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204175.47885: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204175.47900: getting variables 19665 1727204175.47902: in VariableManager get_vars() 19665 1727204175.47947: Calling all_inventory to load vars for managed-node3 19665 1727204175.47950: Calling groups_inventory to load vars for managed-node3 19665 1727204175.47953: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204175.47968: Calling all_plugins_play to load vars for managed-node3 19665 1727204175.47972: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204175.47976: Calling groups_plugins_play to load vars for managed-node3 19665 1727204175.49021: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000044 19665 1727204175.49025: WORKER PROCESS EXITING 19665 1727204175.49943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204175.53993: done with get_vars() 19665 1727204175.54025: done getting variables 19665 1727204175.54093: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.081) 0:00:26.407 ***** 19665 1727204175.54124: entering _queue_task() for managed-node3/service 19665 1727204175.55156: worker is 1 (out of 1 available) 19665 1727204175.55174: exiting _queue_task() for managed-node3/service 19665 1727204175.55188: done queuing things up, now waiting for results queue to drain 19665 1727204175.55189: waiting for pending results... 
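The task queued just above (roles/network/tasks/main.yml:109) uses the 'service' action plugin and, per the conditional evaluated a little further down in this log, only runs when wireless or team connections are defined. A hedged sketch, assuming a plain restart of the NetworkManager unit; the real task may reference the role's service-name variable instead:

```yaml
# Hypothetical sketch; only the name and the when-condition are taken from the log.
- name: Restart NetworkManager due to wireless or team interfaces
  service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```

Neither wireless nor team connections are defined for this profile, so the restart is skipped as well.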
19665 1727204175.56171: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204175.56275: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000045 19665 1727204175.56288: variable 'ansible_search_path' from source: unknown 19665 1727204175.56291: variable 'ansible_search_path' from source: unknown 19665 1727204175.56330: calling self._execute() 19665 1727204175.56429: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.56433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.56443: variable 'omit' from source: magic vars 19665 1727204175.57535: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.57548: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.57670: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.58265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204175.62994: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204175.63278: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204175.63318: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204175.63353: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204175.63385: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204175.63458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.63556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.63559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.63562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.64070: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.64074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.64078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.64080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 19665 1727204175.64102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.64116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.64155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.64180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.64204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.64244: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.64256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.64728: variable 'network_connections' from source: play vars 19665 1727204175.64743: variable 'profile' from source: play vars 19665 1727204175.64815: variable 'profile' from source: play vars 19665 1727204175.64820: variable 'interface' from source: set_fact 19665 1727204175.65086: variable 'interface' from source: set_fact 19665 1727204175.65155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204175.65344: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204175.65583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204175.65613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204175.65644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204175.65687: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204175.65710: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204175.65732: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.65757: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204175.66009: variable '__network_team_connections_defined' from source: role '' defaults 19665 
1727204175.66255: variable 'network_connections' from source: play vars 19665 1727204175.66259: variable 'profile' from source: play vars 19665 1727204175.66323: variable 'profile' from source: play vars 19665 1727204175.66326: variable 'interface' from source: set_fact 19665 1727204175.66386: variable 'interface' from source: set_fact 19665 1727204175.66412: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204175.66415: when evaluation is False, skipping this task 19665 1727204175.66419: _execute() done 19665 1727204175.66421: dumping result to json 19665 1727204175.66424: done dumping result, returning 19665 1727204175.66432: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000045] 19665 1727204175.66445: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000045 19665 1727204175.66527: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000045 19665 1727204175.66530: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204175.66575: no more pending results, returning what we have 19665 1727204175.66579: results queue empty 19665 1727204175.66580: checking for any_errors_fatal 19665 1727204175.66588: done checking for any_errors_fatal 19665 1727204175.66588: checking for max_fail_percentage 19665 1727204175.66590: done checking for max_fail_percentage 19665 1727204175.66590: checking to see if all hosts have failed and the running result is not ok 19665 1727204175.66591: done checking to see if all hosts have failed 19665 1727204175.66592: getting the remaining hosts for this loop 19665 1727204175.66594: done getting the remaining hosts for this loop 19665 1727204175.66598: getting the next task for host managed-node3 19665 1727204175.66604: done getting next task for host managed-node3 19665 1727204175.66608: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204175.66610: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204175.66622: getting variables 19665 1727204175.66624: in VariableManager get_vars() 19665 1727204175.66666: Calling all_inventory to load vars for managed-node3 19665 1727204175.66669: Calling groups_inventory to load vars for managed-node3 19665 1727204175.66671: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204175.66682: Calling all_plugins_play to load vars for managed-node3 19665 1727204175.66685: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204175.66688: Calling groups_plugins_play to load vars for managed-node3 19665 1727204175.68809: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204175.71180: done with get_vars() 19665 1727204175.71206: done getting variables 19665 1727204175.71275: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:15 -0400 (0:00:00.171) 0:00:26.579 ***** 19665 1727204175.71308: entering _queue_task() for managed-node3/service 19665 1727204175.71658: worker is 1 (out of 1 available) 19665 1727204175.71673: exiting _queue_task() for managed-node3/service 19665 1727204175.71690: done queuing things up, now waiting for results queue to drain 19665 1727204175.71692: waiting for pending results... 19665 1727204175.71981: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204175.72107: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000046 19665 1727204175.72136: variable 'ansible_search_path' from source: unknown 19665 1727204175.72149: variable 'ansible_search_path' from source: unknown 19665 1727204175.72194: calling self._execute() 19665 1727204175.72305: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.72317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.72333: variable 'omit' from source: magic vars 19665 1727204175.72755: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.72779: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204175.72965: variable 'network_provider' from source: set_fact 19665 1727204175.72978: variable 'network_state' from source: role '' defaults 19665 1727204175.72993: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19665 1727204175.73011: variable 'omit' from source: magic vars 19665 1727204175.73057: variable 'omit' from source: magic vars 19665 1727204175.73094: variable 'network_service_name' from source: role '' defaults 19665 1727204175.73172: variable 'network_service_name' from source: role '' defaults 19665 1727204175.73294: variable '__network_provider_setup' from source: role '' defaults 19665 1727204175.73305: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204175.73383: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204175.73396: variable '__network_packages_default_nm' from source: role '' defaults 
19665 1727204175.73469: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204175.73712: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204175.77087: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204175.77163: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204175.77231: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204175.77277: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204175.77310: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204175.77404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.77448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.77482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.77526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.77555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.77604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.77632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.77671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.77714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.77731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.77974: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204175.78092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.78119: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.78149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.78200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.78218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.78320: variable 'ansible_python' from source: facts 19665 1727204175.78349: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204175.78445: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204175.78536: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204175.78871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.78903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.78936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.78997: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.79085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.79137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204175.79309: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204175.79337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.79503: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204175.79524: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204175.79686: variable 'network_connections' from 
source: play vars 19665 1727204175.79841: variable 'profile' from source: play vars 19665 1727204175.79925: variable 'profile' from source: play vars 19665 1727204175.80058: variable 'interface' from source: set_fact 19665 1727204175.80128: variable 'interface' from source: set_fact 19665 1727204175.80349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204175.80574: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204175.80635: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204175.80688: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204175.80743: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204175.81513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204175.81607: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204175.81649: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204175.81862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204175.81985: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.82934: variable 'network_connections' from source: play vars 19665 1727204175.83023: variable 'profile' from source: play vars 19665 1727204175.83298: variable 'profile' from source: play vars 19665 1727204175.83354: variable 'interface' from source: set_fact 19665 1727204175.83430: variable 'interface' from source: set_fact 19665 1727204175.83708: variable '__network_packages_default_wireless' from source: role '' defaults 19665 1727204175.83912: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204175.84670: variable 'network_connections' from source: play vars 19665 1727204175.84681: variable 'profile' from source: play vars 19665 1727204175.84873: variable 'profile' from source: play vars 19665 1727204175.84884: variable 'interface' from source: set_fact 19665 1727204175.84965: variable 'interface' from source: set_fact 19665 1727204175.85103: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204175.85190: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204175.85923: variable 'network_connections' from source: play vars 19665 1727204175.85933: variable 'profile' from source: play vars 19665 1727204175.86011: variable 'profile' from source: play vars 19665 1727204175.86077: variable 'interface' from source: set_fact 19665 1727204175.86249: variable 'interface' from source: set_fact 19665 1727204175.86404: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204175.86473: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204175.86513: 
variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204175.86632: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204175.87176: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204175.88128: variable 'network_connections' from source: play vars 19665 1727204175.88253: variable 'profile' from source: play vars 19665 1727204175.88316: variable 'profile' from source: play vars 19665 1727204175.88375: variable 'interface' from source: set_fact 19665 1727204175.88451: variable 'interface' from source: set_fact 19665 1727204175.88583: variable 'ansible_distribution' from source: facts 19665 1727204175.88593: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.88602: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.88619: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204175.89012: variable 'ansible_distribution' from source: facts 19665 1727204175.89022: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.89032: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.89053: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204175.89457: variable 'ansible_distribution' from source: facts 19665 1727204175.89468: variable '__network_rh_distros' from source: role '' defaults 19665 1727204175.89478: variable 'ansible_distribution_major_version' from source: facts 19665 1727204175.89520: variable 'network_provider' from source: set_fact 19665 1727204175.89680: variable 'omit' from source: magic vars 19665 1727204175.89716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204175.89751: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204175.89885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204175.89907: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204175.89923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204175.89961: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204175.89971: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.89980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.90088: Set connection var ansible_connection to ssh 19665 1727204175.90216: Set connection var ansible_shell_type to sh 19665 1727204175.90228: Set connection var ansible_timeout to 10 19665 1727204175.90324: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204175.90337: Set connection var ansible_shell_executable to /bin/sh 19665 1727204175.90353: Set connection var ansible_pipelining to False 19665 1727204175.90387: variable 'ansible_shell_executable' from source: unknown 19665 1727204175.90396: variable 'ansible_connection' from source: unknown 19665 1727204175.90404: variable 'ansible_module_compression' from source: unknown 19665 1727204175.90411: variable 'ansible_shell_type' from source: unknown 19665 1727204175.90419: variable 'ansible_shell_executable' from 
source: unknown 19665 1727204175.90428: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204175.90543: variable 'ansible_pipelining' from source: unknown 19665 1727204175.90552: variable 'ansible_timeout' from source: unknown 19665 1727204175.90561: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204175.90681: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204175.90768: variable 'omit' from source: magic vars 19665 1727204175.90780: starting attempt loop 19665 1727204175.90787: running the handler 19665 1727204175.90984: variable 'ansible_facts' from source: unknown 19665 1727204175.92733: _low_level_execute_command(): starting 19665 1727204175.92749: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204175.94897: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204175.94902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204175.94926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204175.94930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204175.94932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204175.94997: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204175.95009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204175.95077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204175.96724: stdout chunk (state=3): >>>/root <<< 19665 1727204175.96820: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204175.96910: stderr chunk (state=3): >>><<< 19665 1727204175.96914: stdout chunk (state=3): >>><<< 19665 1727204175.96974: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204175.96979: _low_level_execute_command(): starting 19665 1727204175.96983: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533 `" && echo ansible-tmp-1727204175.9693544-21747-62689712154533="` echo /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533 `" ) && sleep 0' 19665 1727204175.98642: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204175.98647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204175.98679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204175.98682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204175.98685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204175.98759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204175.98762: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204175.99402: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204175.99450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.01376: stdout chunk (state=3): >>>ansible-tmp-1727204175.9693544-21747-62689712154533=/root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533 <<< 19665 1727204176.01481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.01577: stderr chunk (state=3): >>><<< 19665 1727204176.01580: stdout chunk (state=3): >>><<< 19665 1727204176.01773: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204175.9693544-21747-62689712154533=/root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.01776: variable 'ansible_module_compression' from source: unknown 19665 1727204176.01778: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19665 1727204176.01780: variable 'ansible_facts' from source: unknown 19665 1727204176.01954: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/AnsiballZ_systemd.py 19665 1727204176.02639: Sending initial data 19665 1727204176.02642: Sent initial data (155 bytes) 19665 1727204176.05328: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204176.05336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.05351: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.05367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.05408: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.05415: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204176.05424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.05437: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204176.05447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204176.05453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204176.05461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.05472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.05494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.05503: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.05510: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204176.05520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.05598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.05613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.05616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.05694: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.07503: stderr chunk (state=3): >>>debug2: Remote version: 3 
debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204176.07543: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204176.07584: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpdzm1oxv8 /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/AnsiballZ_systemd.py <<< 19665 1727204176.07637: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204176.10210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.10388: stderr chunk (state=3): >>><<< 19665 1727204176.10391: stdout chunk (state=3): >>><<< 19665 1727204176.10394: done transferring module to remote 19665 1727204176.10396: _low_level_execute_command(): starting 19665 1727204176.10398: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/ /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/AnsiballZ_systemd.py && sleep 0' 19665 1727204176.11131: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204176.11150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.11169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.11188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.11235: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.11251: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204176.11268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.11288: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204176.11301: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204176.11312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204176.11322: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.11337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.11354: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.11369: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.11381: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204176.11396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.11477: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 
1727204176.11494: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.11509: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.11588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.13481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.13485: stdout chunk (state=3): >>><<< 19665 1727204176.13487: stderr chunk (state=3): >>><<< 19665 1727204176.13590: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.13594: _low_level_execute_command(): starting 19665 1727204176.13599: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/AnsiballZ_systemd.py && sleep 0' 19665 1727204176.14566: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.14570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.14660: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204176.14682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.14685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.15686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.15711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.15822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.41220: stdout chunk 
(state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16142336", "MemoryAvailable": "infinity", "CPUUsageNSec": "1391364000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", 
"LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": <<< 19665 1727204176.41246: stdout chunk (state=3): >>>"0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": 
"/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 19665 1727204176.42906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204176.42910: stdout chunk (state=3): >>><<< 19665 1727204176.42912: stderr chunk (state=3): >>><<< 19665 1727204176.43072: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16142336", "MemoryAvailable": "infinity", "CPUUsageNSec": "1391364000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": 
"infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", 
"ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204176.43131: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204176.43156: _low_level_execute_command(): starting 19665 1727204176.43172: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204175.9693544-21747-62689712154533/ > /dev/null 2>&1 && sleep 0' 19665 1727204176.43869: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204176.43886: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.43906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.43928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.43976: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.43990: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204176.44004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.44032: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204176.44045: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204176.44057: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204176.44071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.44085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.44102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.44114: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.44131: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204176.44150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.44231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.44261: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.44282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.44363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.46254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.46258: stdout chunk (state=3): >>><<< 19665 1727204176.46265: stderr chunk (state=3): >>><<< 19665 1727204176.46289: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.46296: handler run complete 19665 1727204176.46361: attempt loop complete, returning result 19665 1727204176.46367: _execute() done 19665 1727204176.46370: dumping result to json 19665 1727204176.46388: done dumping result, returning 19665 1727204176.46398: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0dcc-3ea6-000000000046] 19665 1727204176.46403: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000046 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204176.46743: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000046 19665 1727204176.46747: WORKER PROCESS EXITING 19665 1727204176.46756: no more pending results, returning what we have 19665 1727204176.46760: results queue empty 19665 1727204176.46761: checking for any_errors_fatal 19665 1727204176.46769: done checking for any_errors_fatal 19665 1727204176.46770: checking for max_fail_percentage 19665 1727204176.46772: done checking for max_fail_percentage 19665 1727204176.46773: checking to see if all hosts have failed and the running result is not ok 19665 1727204176.46774: done checking to see if all hosts have failed 19665 1727204176.46776: getting the remaining hosts for this loop 19665 1727204176.46777: done getting the remaining hosts for this loop 19665 1727204176.46782: getting the next task for host managed-node3 19665 1727204176.46788: done getting next task for host managed-node3 19665 1727204176.46792: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204176.46794: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204176.46806: getting variables 19665 1727204176.46807: in VariableManager get_vars() 19665 1727204176.46847: Calling all_inventory to load vars for managed-node3 19665 1727204176.46851: Calling groups_inventory to load vars for managed-node3 19665 1727204176.46854: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204176.46866: Calling all_plugins_play to load vars for managed-node3 19665 1727204176.46870: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204176.46873: Calling groups_plugins_play to load vars for managed-node3 19665 1727204176.48542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204176.50204: done with get_vars() 19665 1727204176.50225: done getting variables 19665 1727204176.50289: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.790) 0:00:27.369 ***** 19665 1727204176.50317: entering _queue_task() for managed-node3/service 19665 1727204176.50554: worker is 1 (out of 1 available) 19665 1727204176.50570: exiting _queue_task() for managed-node3/service 19665 1727204176.50582: done queuing things up, now waiting for results queue to drain 19665 1727204176.50584: waiting for pending results... 19665 1727204176.50762: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204176.50867: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000047 19665 1727204176.50875: variable 'ansible_search_path' from source: unknown 19665 1727204176.50879: variable 'ansible_search_path' from source: unknown 19665 1727204176.50908: calling self._execute() 19665 1727204176.50984: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.50987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.50995: variable 'omit' from source: magic vars 19665 1727204176.51286: variable 'ansible_distribution_major_version' from source: facts 19665 1727204176.51298: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204176.51381: variable 'network_provider' from source: set_fact 19665 1727204176.51385: Evaluated conditional (network_provider == "nm"): True 19665 1727204176.51452: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204176.51514: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204176.51633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204176.53329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204176.53374: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204176.53402: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204176.53427: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204176.53452: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204176.53526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204176.53553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204176.53579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204176.53619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204176.53632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204176.53677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204176.53699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204176.53721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204176.53761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204176.53776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204176.53812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204176.53832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204176.53857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204176.53894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204176.53907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 19665 1727204176.54038: variable 'network_connections' from source: play vars 19665 1727204176.54050: variable 'profile' from source: play vars 19665 1727204176.54110: variable 'profile' from source: play vars 19665 1727204176.54113: variable 'interface' from source: set_fact 19665 1727204176.54170: variable 'interface' from source: set_fact 19665 1727204176.54230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204176.54377: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204176.54409: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204176.54437: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204176.54468: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204176.54506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204176.54524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204176.54550: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204176.54573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204176.54621: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204176.54786: variable 'network_connections' from source: play vars 19665 1727204176.54789: variable 'profile' from source: play vars 19665 1727204176.54834: variable 'profile' from source: play vars 19665 1727204176.54837: variable 'interface' from source: set_fact 19665 1727204176.54883: variable 'interface' from source: set_fact 19665 1727204176.54904: Evaluated conditional (__network_wpa_supplicant_required): False 19665 1727204176.54909: when evaluation is False, skipping this task 19665 1727204176.54912: _execute() done 19665 1727204176.54922: dumping result to json 19665 1727204176.54924: done dumping result, returning 19665 1727204176.54927: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0dcc-3ea6-000000000047] 19665 1727204176.54929: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000047 19665 1727204176.55018: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000047 19665 1727204176.55021: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19665 1727204176.55082: no more pending results, returning what we have 19665 1727204176.55086: results queue empty 19665 1727204176.55087: checking for any_errors_fatal 19665 1727204176.55105: done checking for any_errors_fatal 19665 1727204176.55106: checking for max_fail_percentage 19665 1727204176.55108: done checking for max_fail_percentage 19665 
1727204176.55108: checking to see if all hosts have failed and the running result is not ok 19665 1727204176.55109: done checking to see if all hosts have failed 19665 1727204176.55110: getting the remaining hosts for this loop 19665 1727204176.55112: done getting the remaining hosts for this loop 19665 1727204176.55116: getting the next task for host managed-node3 19665 1727204176.55122: done getting next task for host managed-node3 19665 1727204176.55127: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204176.55129: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204176.55142: getting variables 19665 1727204176.55144: in VariableManager get_vars() 19665 1727204176.55180: Calling all_inventory to load vars for managed-node3 19665 1727204176.55182: Calling groups_inventory to load vars for managed-node3 19665 1727204176.55184: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204176.55193: Calling all_plugins_play to load vars for managed-node3 19665 1727204176.55195: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204176.55198: Calling groups_plugins_play to load vars for managed-node3 19665 1727204176.55999: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204176.57020: done with get_vars() 19665 1727204176.57035: done getting variables 19665 1727204176.57080: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.067) 0:00:27.437 ***** 19665 1727204176.57103: entering _queue_task() for managed-node3/service 19665 1727204176.57328: worker is 1 (out of 1 available) 19665 1727204176.57341: exiting _queue_task() for managed-node3/service 19665 1727204176.57353: done queuing things up, now waiting for results queue to drain 19665 1727204176.57355: waiting for pending results... 
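The "Evaluated conditional" entries above show Ansible rendering each when: expression for the wpa_supplicant task against the host's variables; as soon as __network_wpa_supplicant_required comes back False, the task is skipped and that expression is reported as false_condition. A rough standalone approximation using plain Jinja2 follows; the variable values are illustrative, and Ansible's real evaluation goes through its own Templar with different handling of undefined variables.

    from jinja2.nativetypes import NativeEnvironment

    # Illustrative host variables; only the last one differs from the log's outcome in name.
    host_vars = {
        "ansible_distribution_major_version": "9",   # assumed value for illustration
        "network_provider": "nm",
        "__network_wpa_supplicant_required": False,
    }

    env = NativeEnvironment()

    conditionals = [
        "ansible_distribution_major_version != '6'",
        'network_provider == "nm"',
        "__network_wpa_supplicant_required",
    ]

    for expr in conditionals:
        value = env.compile_expression(expr)(**host_vars)
        print(f"Evaluated conditional ({expr}): {bool(value)}")
        if not value:
            # The first false expression is what Ansible reports as false_condition.
            print(f"skipping task, false_condition={expr!r}")
            break

The false_condition and skip_reason fields in the skipping result above carry exactly this outcome back to the callback plugin that prints the task status.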
19665 1727204176.57529: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204176.57608: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000048 19665 1727204176.57619: variable 'ansible_search_path' from source: unknown 19665 1727204176.57623: variable 'ansible_search_path' from source: unknown 19665 1727204176.57653: calling self._execute() 19665 1727204176.57729: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.57734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.57745: variable 'omit' from source: magic vars 19665 1727204176.58019: variable 'ansible_distribution_major_version' from source: facts 19665 1727204176.58030: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204176.58114: variable 'network_provider' from source: set_fact 19665 1727204176.58120: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204176.58122: when evaluation is False, skipping this task 19665 1727204176.58125: _execute() done 19665 1727204176.58128: dumping result to json 19665 1727204176.58130: done dumping result, returning 19665 1727204176.58136: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0dcc-3ea6-000000000048] 19665 1727204176.58145: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000048 19665 1727204176.58228: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000048 19665 1727204176.58231: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204176.58281: no more pending results, returning what we have 19665 1727204176.58285: results queue empty 19665 1727204176.58286: checking for any_errors_fatal 19665 1727204176.58293: done checking for any_errors_fatal 19665 1727204176.58294: checking for max_fail_percentage 19665 1727204176.58295: done checking for max_fail_percentage 19665 1727204176.58296: checking to see if all hosts have failed and the running result is not ok 19665 1727204176.58297: done checking to see if all hosts have failed 19665 1727204176.58298: getting the remaining hosts for this loop 19665 1727204176.58300: done getting the remaining hosts for this loop 19665 1727204176.58303: getting the next task for host managed-node3 19665 1727204176.58308: done getting next task for host managed-node3 19665 1727204176.58313: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204176.58315: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204176.58328: getting variables 19665 1727204176.58330: in VariableManager get_vars() 19665 1727204176.58368: Calling all_inventory to load vars for managed-node3 19665 1727204176.58371: Calling groups_inventory to load vars for managed-node3 19665 1727204176.58373: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204176.58382: Calling all_plugins_play to load vars for managed-node3 19665 1727204176.58384: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204176.58387: Calling groups_plugins_play to load vars for managed-node3 19665 1727204176.59169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204176.60119: done with get_vars() 19665 1727204176.60137: done getting variables 19665 1727204176.60184: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.031) 0:00:27.468 ***** 19665 1727204176.60209: entering _queue_task() for managed-node3/copy 19665 1727204176.60440: worker is 1 (out of 1 available) 19665 1727204176.60456: exiting _queue_task() for managed-node3/copy 19665 1727204176.60471: done queuing things up, now waiting for results queue to drain 19665 1727204176.60473: waiting for pending results... 19665 1727204176.60649: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204176.60728: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000049 19665 1727204176.60739: variable 'ansible_search_path' from source: unknown 19665 1727204176.60744: variable 'ansible_search_path' from source: unknown 19665 1727204176.60774: calling self._execute() 19665 1727204176.60853: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.60857: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.60869: variable 'omit' from source: magic vars 19665 1727204176.61146: variable 'ansible_distribution_major_version' from source: facts 19665 1727204176.61158: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204176.61559: variable 'network_provider' from source: set_fact 19665 1727204176.61562: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204176.61567: when evaluation is False, skipping this task 19665 1727204176.61569: _execute() done 19665 1727204176.61571: dumping result to json 19665 1727204176.61573: done dumping result, returning 19665 1727204176.61576: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0dcc-3ea6-000000000049] 19665 1727204176.61578: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000049 19665 1727204176.61642: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000049 19665 1727204176.61645: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 19665 1727204176.61697: no more pending results, returning what we have 19665 1727204176.61700: results queue empty 19665 1727204176.61700: checking for any_errors_fatal 19665 1727204176.61705: done checking for any_errors_fatal 19665 1727204176.61706: checking for max_fail_percentage 19665 1727204176.61707: done checking for max_fail_percentage 19665 1727204176.61708: checking to see if all hosts have failed and the running result is not ok 19665 1727204176.61709: done checking to see if all hosts have failed 19665 1727204176.61710: getting the remaining hosts for this loop 19665 1727204176.61711: done getting the remaining hosts for this loop 19665 1727204176.61714: getting the next task for host managed-node3 19665 1727204176.61719: done getting next task for host managed-node3 19665 1727204176.61722: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204176.61724: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204176.61735: getting variables 19665 1727204176.61737: in VariableManager get_vars() 19665 1727204176.61772: Calling all_inventory to load vars for managed-node3 19665 1727204176.61775: Calling groups_inventory to load vars for managed-node3 19665 1727204176.61777: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204176.61785: Calling all_plugins_play to load vars for managed-node3 19665 1727204176.61788: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204176.61790: Calling groups_plugins_play to load vars for managed-node3 19665 1727204176.63212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204176.64135: done with get_vars() 19665 1727204176.64155: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:16 -0400 (0:00:00.040) 0:00:27.508 ***** 19665 1727204176.64219: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204176.64465: worker is 1 (out of 1 available) 19665 1727204176.64479: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204176.64492: done queuing things up, now waiting for results queue to drain 19665 1727204176.64493: waiting for pending results... 
19665 1727204176.64667: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204176.64743: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004a 19665 1727204176.64753: variable 'ansible_search_path' from source: unknown 19665 1727204176.64756: variable 'ansible_search_path' from source: unknown 19665 1727204176.64784: calling self._execute() 19665 1727204176.64859: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.64865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.64873: variable 'omit' from source: magic vars 19665 1727204176.65153: variable 'ansible_distribution_major_version' from source: facts 19665 1727204176.65166: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204176.65171: variable 'omit' from source: magic vars 19665 1727204176.65198: variable 'omit' from source: magic vars 19665 1727204176.65312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204176.66844: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204176.66894: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204176.66921: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204176.66950: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204176.66972: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204176.67032: variable 'network_provider' from source: set_fact 19665 1727204176.67128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204176.67163: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204176.67182: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204176.67210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204176.67223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204176.67280: variable 'omit' from source: magic vars 19665 1727204176.67362: variable 'omit' from source: magic vars 19665 1727204176.67433: variable 'network_connections' from source: play vars 19665 1727204176.67445: variable 'profile' from source: play vars 19665 1727204176.67490: variable 'profile' from source: play vars 19665 1727204176.67494: variable 'interface' from source: set_fact 19665 1727204176.67537: variable 'interface' from source: set_fact 19665 1727204176.67645: variable 'omit' from source: magic vars 19665 1727204176.67653: 
variable '__lsr_ansible_managed' from source: task vars 19665 1727204176.67696: variable '__lsr_ansible_managed' from source: task vars 19665 1727204176.67825: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19665 1727204176.67977: Loaded config def from plugin (lookup/template) 19665 1727204176.67981: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19665 1727204176.68001: File lookup term: get_ansible_managed.j2 19665 1727204176.68004: variable 'ansible_search_path' from source: unknown 19665 1727204176.68007: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19665 1727204176.68020: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19665 1727204176.68032: variable 'ansible_search_path' from source: unknown 19665 1727204176.71600: variable 'ansible_managed' from source: unknown 19665 1727204176.71692: variable 'omit' from source: magic vars 19665 1727204176.71713: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204176.71732: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204176.71752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204176.71766: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204176.71777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204176.71799: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204176.71802: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.71804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.71869: Set connection var ansible_connection to ssh 19665 1727204176.71876: Set connection var ansible_shell_type to sh 19665 1727204176.71883: Set connection var ansible_timeout to 10 19665 1727204176.71888: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204176.71895: Set connection var ansible_shell_executable to /bin/sh 19665 1727204176.71901: Set connection var ansible_pipelining to False 19665 1727204176.71917: variable 'ansible_shell_executable' from source: unknown 19665 
1727204176.71920: variable 'ansible_connection' from source: unknown 19665 1727204176.71923: variable 'ansible_module_compression' from source: unknown 19665 1727204176.71925: variable 'ansible_shell_type' from source: unknown 19665 1727204176.71927: variable 'ansible_shell_executable' from source: unknown 19665 1727204176.71929: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204176.71933: variable 'ansible_pipelining' from source: unknown 19665 1727204176.71936: variable 'ansible_timeout' from source: unknown 19665 1727204176.71940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204176.72038: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204176.72049: variable 'omit' from source: magic vars 19665 1727204176.72054: starting attempt loop 19665 1727204176.72057: running the handler 19665 1727204176.72070: _low_level_execute_command(): starting 19665 1727204176.72076: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204176.72596: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.72605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.72636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204176.72649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204176.72661: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.72709: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.72721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.72784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.74388: stdout chunk (state=3): >>>/root <<< 19665 1727204176.74494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.74554: stderr chunk (state=3): >>><<< 19665 1727204176.74560: stdout chunk (state=3): >>><<< 19665 1727204176.74584: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.74594: _low_level_execute_command(): starting 19665 1727204176.74600: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358 `" && echo ansible-tmp-1727204176.7458441-21832-4116540516358="` echo /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358 `" ) && sleep 0' 19665 1727204176.75069: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.75076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.75108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.75120: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.75130: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.75179: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.75191: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.75245: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.77080: stdout chunk (state=3): >>>ansible-tmp-1727204176.7458441-21832-4116540516358=/root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358 <<< 19665 1727204176.77195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.77250: stderr chunk (state=3): >>><<< 19665 1727204176.77253: stdout chunk (state=3): >>><<< 19665 1727204176.77275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204176.7458441-21832-4116540516358=/root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 
originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.77313: variable 'ansible_module_compression' from source: unknown 19665 1727204176.77351: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19665 1727204176.77377: variable 'ansible_facts' from source: unknown 19665 1727204176.77443: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/AnsiballZ_network_connections.py 19665 1727204176.77551: Sending initial data 19665 1727204176.77555: Sent initial data (166 bytes) 19665 1727204176.78247: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.78253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.78285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.78302: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.78347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.78360: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.78373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.78418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.80094: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" 
revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204176.80130: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204176.80175: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp_y92_9x2 /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/AnsiballZ_network_connections.py <<< 19665 1727204176.80212: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204176.81347: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.81455: stderr chunk (state=3): >>><<< 19665 1727204176.81460: stdout chunk (state=3): >>><<< 19665 1727204176.81480: done transferring module to remote 19665 1727204176.81489: _low_level_execute_command(): starting 19665 1727204176.81494: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/ /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/AnsiballZ_network_connections.py && sleep 0' 19665 1727204176.81949: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.81961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.81990: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204176.82006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.82055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.82070: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.82117: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204176.83895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204176.83930: stderr chunk (state=3): >>><<< 19665 1727204176.83933: stdout chunk (state=3): >>><<< 19665 1727204176.83970: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204176.83973: _low_level_execute_command(): starting 19665 1727204176.83975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/AnsiballZ_network_connections.py && sleep 0' 19665 1727204176.84680: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204176.84696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.84712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.84731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.84778: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.84791: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204176.84806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.84826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204176.84841: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204176.84853: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204176.84869: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204176.84884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204176.84901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204176.84915: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204176.84927: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204176.84943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204176.85019: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204176.85044: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204176.85062: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204176.85147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.12805: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", 
"ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19665 1727204177.14497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204177.14502: stdout chunk (state=3): >>><<< 19665 1727204177.14504: stderr chunk (state=3): >>><<< 19665 1727204177.14652: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204177.14656: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204177.14659: _low_level_execute_command(): starting 19665 1727204177.14661: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204176.7458441-21832-4116540516358/ > /dev/null 2>&1 && sleep 0' 19665 1727204177.15299: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.15313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.15334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.15356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.15402: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.15414: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.15429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.15453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.15467: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.15478: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.15489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.15501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.15516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.15527: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.15543: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.15561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.15643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.15668: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.15683: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.15758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.17595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204177.17681: stderr chunk (state=3): >>><<< 19665 
1727204177.17686: stdout chunk (state=3): >>><<< 19665 1727204177.17708: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204177.17714: handler run complete 19665 1727204177.17744: attempt loop complete, returning result 19665 1727204177.17747: _execute() done 19665 1727204177.17749: dumping result to json 19665 1727204177.17751: done dumping result, returning 19665 1727204177.17762: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0dcc-3ea6-00000000004a] 19665 1727204177.17768: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004a 19665 1727204177.17875: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004a 19665 1727204177.17883: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19665 1727204177.18079: no more pending results, returning what we have 19665 1727204177.18083: results queue empty 19665 1727204177.18084: checking for any_errors_fatal 19665 1727204177.18093: done checking for any_errors_fatal 19665 1727204177.18094: checking for max_fail_percentage 19665 1727204177.18096: done checking for max_fail_percentage 19665 1727204177.18096: checking to see if all hosts have failed and the running result is not ok 19665 1727204177.18097: done checking to see if all hosts have failed 19665 1727204177.18098: getting the remaining hosts for this loop 19665 1727204177.18100: done getting the remaining hosts for this loop 19665 1727204177.18104: getting the next task for host managed-node3 19665 1727204177.18110: done getting next task for host managed-node3 19665 1727204177.18114: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19665 1727204177.18116: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204177.18125: getting variables 19665 1727204177.18127: in VariableManager get_vars() 19665 1727204177.18165: Calling all_inventory to load vars for managed-node3 19665 1727204177.18168: Calling groups_inventory to load vars for managed-node3 19665 1727204177.18171: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204177.18181: Calling all_plugins_play to load vars for managed-node3 19665 1727204177.18184: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204177.18187: Calling groups_plugins_play to load vars for managed-node3 19665 1727204177.19874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204177.23960: done with get_vars() 19665 1727204177.24005: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.599) 0:00:28.108 ***** 19665 1727204177.24220: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204177.24935: worker is 1 (out of 1 available) 19665 1727204177.24949: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204177.24960: done queuing things up, now waiting for results queue to drain 19665 1727204177.24962: waiting for pending results... 19665 1727204177.25516: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 19665 1727204177.25682: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004b 19665 1727204177.25705: variable 'ansible_search_path' from source: unknown 19665 1727204177.25726: variable 'ansible_search_path' from source: unknown 19665 1727204177.25769: calling self._execute() 19665 1727204177.25886: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.25892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.25904: variable 'omit' from source: magic vars 19665 1727204177.26328: variable 'ansible_distribution_major_version' from source: facts 19665 1727204177.26341: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204177.26481: variable 'network_state' from source: role '' defaults 19665 1727204177.26489: Evaluated conditional (network_state != {}): False 19665 1727204177.26492: when evaluation is False, skipping this task 19665 1727204177.26495: _execute() done 19665 1727204177.26497: dumping result to json 19665 1727204177.26500: done dumping result, returning 19665 1727204177.26508: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0dcc-3ea6-00000000004b] 19665 1727204177.26519: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004b 19665 1727204177.26609: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004b 19665 1727204177.26612: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204177.26675: no more pending results, returning what we have 19665 1727204177.26679: results queue empty 19665 1727204177.26680: checking for any_errors_fatal 19665 1727204177.26692: done checking for any_errors_fatal 19665 1727204177.26692: checking for max_fail_percentage 19665 
1727204177.26694: done checking for max_fail_percentage 19665 1727204177.26695: checking to see if all hosts have failed and the running result is not ok 19665 1727204177.26696: done checking to see if all hosts have failed 19665 1727204177.26697: getting the remaining hosts for this loop 19665 1727204177.26699: done getting the remaining hosts for this loop 19665 1727204177.26703: getting the next task for host managed-node3 19665 1727204177.26710: done getting next task for host managed-node3 19665 1727204177.26714: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204177.26716: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204177.26732: getting variables 19665 1727204177.26734: in VariableManager get_vars() 19665 1727204177.26778: Calling all_inventory to load vars for managed-node3 19665 1727204177.26781: Calling groups_inventory to load vars for managed-node3 19665 1727204177.26784: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204177.26797: Calling all_plugins_play to load vars for managed-node3 19665 1727204177.26801: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204177.26804: Calling groups_plugins_play to load vars for managed-node3 19665 1727204177.28788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204177.30891: done with get_vars() 19665 1727204177.30922: done getting variables 19665 1727204177.30988: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.068) 0:00:28.176 ***** 19665 1727204177.31026: entering _queue_task() for managed-node3/debug 19665 1727204177.31429: worker is 1 (out of 1 available) 19665 1727204177.31443: exiting _queue_task() for managed-node3/debug 19665 1727204177.31460: done queuing things up, now waiting for results queue to drain 19665 1727204177.31462: waiting for pending results... 
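The "Configure networking state" task at main.yml:171 was skipped because the role default network_state is an empty dict, so the guard the log reports ("network_state != {}") evaluates to False. Below is a hedged reconstruction of that guard; the module name is taken from the queue entry above, the surrounding YAML is illustrative rather than a copy of the role source.

# Illustrative sketch of the skip condition seen in the trace.
- name: Configure networking state
  fedora.linux_system_roles.network_state:
    # module arguments omitted; the task never ran on this host,
    # so the log shows none
  when: network_state != {}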
19665 1727204177.31790: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204177.32039: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004c 19665 1727204177.32056: variable 'ansible_search_path' from source: unknown 19665 1727204177.32059: variable 'ansible_search_path' from source: unknown 19665 1727204177.32097: calling self._execute() 19665 1727204177.32291: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.32294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.32309: variable 'omit' from source: magic vars 19665 1727204177.32804: variable 'ansible_distribution_major_version' from source: facts 19665 1727204177.32816: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204177.32823: variable 'omit' from source: magic vars 19665 1727204177.32907: variable 'omit' from source: magic vars 19665 1727204177.32945: variable 'omit' from source: magic vars 19665 1727204177.32989: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204177.33026: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204177.33050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204177.33069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.33085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.33116: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204177.33124: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.33129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.33225: Set connection var ansible_connection to ssh 19665 1727204177.33239: Set connection var ansible_shell_type to sh 19665 1727204177.33248: Set connection var ansible_timeout to 10 19665 1727204177.33253: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204177.33261: Set connection var ansible_shell_executable to /bin/sh 19665 1727204177.33270: Set connection var ansible_pipelining to False 19665 1727204177.33293: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.33300: variable 'ansible_connection' from source: unknown 19665 1727204177.33304: variable 'ansible_module_compression' from source: unknown 19665 1727204177.33306: variable 'ansible_shell_type' from source: unknown 19665 1727204177.33309: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.33311: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.33314: variable 'ansible_pipelining' from source: unknown 19665 1727204177.33316: variable 'ansible_timeout' from source: unknown 19665 1727204177.33320: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.33477: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 
1727204177.33488: variable 'omit' from source: magic vars 19665 1727204177.33493: starting attempt loop 19665 1727204177.33496: running the handler 19665 1727204177.33795: variable '__network_connections_result' from source: set_fact 19665 1727204177.33798: handler run complete 19665 1727204177.33801: attempt loop complete, returning result 19665 1727204177.33802: _execute() done 19665 1727204177.33804: dumping result to json 19665 1727204177.33806: done dumping result, returning 19665 1727204177.33807: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0dcc-3ea6-00000000004c] 19665 1727204177.33809: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004c 19665 1727204177.33875: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004c 19665 1727204177.33878: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 19665 1727204177.33946: no more pending results, returning what we have 19665 1727204177.33952: results queue empty 19665 1727204177.33953: checking for any_errors_fatal 19665 1727204177.33962: done checking for any_errors_fatal 19665 1727204177.33963: checking for max_fail_percentage 19665 1727204177.33966: done checking for max_fail_percentage 19665 1727204177.33967: checking to see if all hosts have failed and the running result is not ok 19665 1727204177.33968: done checking to see if all hosts have failed 19665 1727204177.33969: getting the remaining hosts for this loop 19665 1727204177.33970: done getting the remaining hosts for this loop 19665 1727204177.33975: getting the next task for host managed-node3 19665 1727204177.33983: done getting next task for host managed-node3 19665 1727204177.33987: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204177.33989: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204177.34000: getting variables 19665 1727204177.34002: in VariableManager get_vars() 19665 1727204177.34044: Calling all_inventory to load vars for managed-node3 19665 1727204177.34047: Calling groups_inventory to load vars for managed-node3 19665 1727204177.34049: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204177.34061: Calling all_plugins_play to load vars for managed-node3 19665 1727204177.34065: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204177.34069: Calling groups_plugins_play to load vars for managed-node3 19665 1727204177.35986: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204177.38399: done with get_vars() 19665 1727204177.38425: done getting variables 19665 1727204177.38496: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.074) 0:00:28.251 ***** 19665 1727204177.38528: entering _queue_task() for managed-node3/debug 19665 1727204177.38980: worker is 1 (out of 1 available) 19665 1727204177.38998: exiting _queue_task() for managed-node3/debug 19665 1727204177.39012: done queuing things up, now waiting for results queue to drain 19665 1727204177.39014: waiting for pending results... 19665 1727204177.39390: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204177.39503: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004d 19665 1727204177.39516: variable 'ansible_search_path' from source: unknown 19665 1727204177.39521: variable 'ansible_search_path' from source: unknown 19665 1727204177.39571: calling self._execute() 19665 1727204177.39672: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.39703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.39707: variable 'omit' from source: magic vars 19665 1727204177.40270: variable 'ansible_distribution_major_version' from source: facts 19665 1727204177.40274: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204177.40276: variable 'omit' from source: magic vars 19665 1727204177.40279: variable 'omit' from source: magic vars 19665 1727204177.40281: variable 'omit' from source: magic vars 19665 1727204177.40354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204177.40360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204177.40384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204177.40407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.40426: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.40461: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204177.40467: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.40470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.40576: Set connection var ansible_connection to ssh 19665 1727204177.40583: Set connection var ansible_shell_type to sh 19665 1727204177.40589: Set connection var ansible_timeout to 10 19665 1727204177.40595: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204177.40603: Set connection var ansible_shell_executable to /bin/sh 19665 1727204177.40611: Set connection var ansible_pipelining to False 19665 1727204177.40717: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.40721: variable 'ansible_connection' from source: unknown 19665 1727204177.40724: variable 'ansible_module_compression' from source: unknown 19665 1727204177.40726: variable 'ansible_shell_type' from source: unknown 19665 1727204177.40728: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.40731: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.40733: variable 'ansible_pipelining' from source: unknown 19665 1727204177.40735: variable 'ansible_timeout' from source: unknown 19665 1727204177.40748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.40901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204177.40913: variable 'omit' from source: magic vars 19665 1727204177.40919: starting attempt loop 19665 1727204177.40922: running the handler 19665 1727204177.40978: variable '__network_connections_result' from source: set_fact 19665 1727204177.41056: variable '__network_connections_result' from source: set_fact 19665 1727204177.41168: handler run complete 19665 1727204177.41200: attempt loop complete, returning result 19665 1727204177.41203: _execute() done 19665 1727204177.41206: dumping result to json 19665 1727204177.41208: done dumping result, returning 19665 1727204177.41218: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0dcc-3ea6-00000000004d] 19665 1727204177.41223: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004d ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 19665 1727204177.41407: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004d 19665 1727204177.41416: no more pending results, returning what we have 19665 1727204177.41420: results queue empty 19665 1727204177.41421: checking for any_errors_fatal 19665 1727204177.41427: done checking for any_errors_fatal 19665 1727204177.41428: checking for max_fail_percentage 19665 1727204177.41430: done checking for max_fail_percentage 19665 1727204177.41431: checking to see if all hosts have failed and the running result is not ok 
19665 1727204177.41432: done checking to see if all hosts have failed 19665 1727204177.41433: getting the remaining hosts for this loop 19665 1727204177.41435: done getting the remaining hosts for this loop 19665 1727204177.41442: getting the next task for host managed-node3 19665 1727204177.41449: done getting next task for host managed-node3 19665 1727204177.41453: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204177.41455: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204177.41468: WORKER PROCESS EXITING 19665 1727204177.41474: getting variables 19665 1727204177.41476: in VariableManager get_vars() 19665 1727204177.41515: Calling all_inventory to load vars for managed-node3 19665 1727204177.41518: Calling groups_inventory to load vars for managed-node3 19665 1727204177.41520: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204177.41532: Calling all_plugins_play to load vars for managed-node3 19665 1727204177.41536: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204177.41542: Calling groups_plugins_play to load vars for managed-node3 19665 1727204177.43483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204177.45569: done with get_vars() 19665 1727204177.45594: done getting variables 19665 1727204177.45658: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.071) 0:00:28.323 ***** 19665 1727204177.45697: entering _queue_task() for managed-node3/debug 19665 1727204177.45999: worker is 1 (out of 1 available) 19665 1727204177.46014: exiting _queue_task() for managed-node3/debug 19665 1727204177.46028: done queuing things up, now waiting for results queue to drain 19665 1727204177.46030: waiting for pending results... 
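The two debug tasks traced above print the registered __network_connections_result: first only its stderr_lines, then the full structure. The task names and the registered variable come straight from the log; the YAML below is an illustrative sketch of that pattern, not a copy of tasks/main.yml.

# Illustrative sketch of the debug pattern used by the role here.
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result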
19665 1727204177.46357: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204177.46448: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004e 19665 1727204177.46461: variable 'ansible_search_path' from source: unknown 19665 1727204177.46467: variable 'ansible_search_path' from source: unknown 19665 1727204177.46506: calling self._execute() 19665 1727204177.46797: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.46801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.46812: variable 'omit' from source: magic vars 19665 1727204177.47368: variable 'ansible_distribution_major_version' from source: facts 19665 1727204177.47382: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204177.47586: variable 'network_state' from source: role '' defaults 19665 1727204177.47596: Evaluated conditional (network_state != {}): False 19665 1727204177.47599: when evaluation is False, skipping this task 19665 1727204177.47602: _execute() done 19665 1727204177.47605: dumping result to json 19665 1727204177.47608: done dumping result, returning 19665 1727204177.47615: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0dcc-3ea6-00000000004e] 19665 1727204177.47622: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004e 19665 1727204177.47718: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004e 19665 1727204177.47720: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 19665 1727204177.47785: no more pending results, returning what we have 19665 1727204177.47789: results queue empty 19665 1727204177.47790: checking for any_errors_fatal 19665 1727204177.47802: done checking for any_errors_fatal 19665 1727204177.47803: checking for max_fail_percentage 19665 1727204177.47805: done checking for max_fail_percentage 19665 1727204177.47806: checking to see if all hosts have failed and the running result is not ok 19665 1727204177.47807: done checking to see if all hosts have failed 19665 1727204177.47807: getting the remaining hosts for this loop 19665 1727204177.47809: done getting the remaining hosts for this loop 19665 1727204177.47814: getting the next task for host managed-node3 19665 1727204177.47822: done getting next task for host managed-node3 19665 1727204177.47826: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204177.47829: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204177.47845: getting variables 19665 1727204177.47847: in VariableManager get_vars() 19665 1727204177.47887: Calling all_inventory to load vars for managed-node3 19665 1727204177.47890: Calling groups_inventory to load vars for managed-node3 19665 1727204177.47893: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204177.47906: Calling all_plugins_play to load vars for managed-node3 19665 1727204177.47909: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204177.47913: Calling groups_plugins_play to load vars for managed-node3 19665 1727204177.49713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204177.51847: done with get_vars() 19665 1727204177.51880: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:17 -0400 (0:00:00.066) 0:00:28.389 ***** 19665 1727204177.52315: entering _queue_task() for managed-node3/ping 19665 1727204177.55131: worker is 1 (out of 1 available) 19665 1727204177.55148: exiting _queue_task() for managed-node3/ping 19665 1727204177.55161: done queuing things up, now waiting for results queue to drain 19665 1727204177.55166: waiting for pending results... 19665 1727204177.56091: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204177.56312: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000004f 19665 1727204177.56325: variable 'ansible_search_path' from source: unknown 19665 1727204177.56328: variable 'ansible_search_path' from source: unknown 19665 1727204177.56487: calling self._execute() 19665 1727204177.56702: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.56707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.56719: variable 'omit' from source: magic vars 19665 1727204177.57599: variable 'ansible_distribution_major_version' from source: facts 19665 1727204177.57614: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204177.57621: variable 'omit' from source: magic vars 19665 1727204177.57781: variable 'omit' from source: magic vars 19665 1727204177.57816: variable 'omit' from source: magic vars 19665 1727204177.57973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204177.58012: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204177.58034: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204177.58056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.58072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204177.58105: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204177.58108: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.58112: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.58442: Set connection var ansible_connection to ssh 19665 1727204177.58452: Set connection var 
ansible_shell_type to sh 19665 1727204177.58458: Set connection var ansible_timeout to 10 19665 1727204177.58465: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204177.58473: Set connection var ansible_shell_executable to /bin/sh 19665 1727204177.58481: Set connection var ansible_pipelining to False 19665 1727204177.58618: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.58621: variable 'ansible_connection' from source: unknown 19665 1727204177.58624: variable 'ansible_module_compression' from source: unknown 19665 1727204177.58627: variable 'ansible_shell_type' from source: unknown 19665 1727204177.58629: variable 'ansible_shell_executable' from source: unknown 19665 1727204177.58633: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204177.58637: variable 'ansible_pipelining' from source: unknown 19665 1727204177.58639: variable 'ansible_timeout' from source: unknown 19665 1727204177.58647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204177.59085: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204177.59097: variable 'omit' from source: magic vars 19665 1727204177.59101: starting attempt loop 19665 1727204177.59105: running the handler 19665 1727204177.59118: _low_level_execute_command(): starting 19665 1727204177.59127: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204177.62171: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.62186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.62203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.62216: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.62262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.62320: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.62331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.62348: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.62358: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.62365: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.62375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.62384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.62397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.62404: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.62411: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.62426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.62507: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 
1727204177.62657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.62672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.62873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.64403: stdout chunk (state=3): >>>/root <<< 19665 1727204177.64587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204177.64592: stdout chunk (state=3): >>><<< 19665 1727204177.64603: stderr chunk (state=3): >>><<< 19665 1727204177.64628: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204177.64644: _low_level_execute_command(): starting 19665 1727204177.64651: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196 `" && echo ansible-tmp-1727204177.6462746-21951-58564694355196="` echo /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196 `" ) && sleep 0' 19665 1727204177.66294: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.66302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.66312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.66326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.66373: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.66380: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.66390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.66485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.66494: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.66500: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.66508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.66517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 
1727204177.66528: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.66538: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.66546: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.66555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.66633: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.66708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.66721: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.66800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.68712: stdout chunk (state=3): >>>ansible-tmp-1727204177.6462746-21951-58564694355196=/root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196 <<< 19665 1727204177.68901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204177.68905: stdout chunk (state=3): >>><<< 19665 1727204177.68912: stderr chunk (state=3): >>><<< 19665 1727204177.68934: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204177.6462746-21951-58564694355196=/root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204177.68989: variable 'ansible_module_compression' from source: unknown 19665 1727204177.69031: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 19665 1727204177.69066: variable 'ansible_facts' from source: unknown 19665 1727204177.69141: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/AnsiballZ_ping.py 19665 1727204177.69693: Sending initial data 19665 1727204177.69696: Sent initial data (152 bytes) 19665 1727204177.72353: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.72488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.72510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.72530: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.72579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.72597: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.72617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.72635: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.72647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.72660: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.72720: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.72736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.72754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.72769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.72783: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.72797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.72882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.72983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.72999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.73155: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.74831: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204177.74869: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204177.74914: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpfwvxn8ke /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/AnsiballZ_ping.py <<< 19665 1727204177.74951: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204177.76361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204177.76490: stderr chunk (state=3): >>><<< 19665 1727204177.76494: stdout chunk (state=3): >>><<< 19665 1727204177.76496: done transferring module to remote 19665 1727204177.76498: _low_level_execute_command(): starting 19665 1727204177.76504: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/ /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/AnsiballZ_ping.py && sleep 0' 19665 1727204177.77770: stderr chunk 
(state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.77912: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.77932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.77951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.77998: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.78017: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.78038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.78057: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.78071: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.78083: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.78095: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.78107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.78149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.78162: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.78176: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.78189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.78300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.78363: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.78382: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.78564: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.80391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204177.80395: stdout chunk (state=3): >>><<< 19665 1727204177.80398: stderr chunk (state=3): >>><<< 19665 1727204177.80478: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 19665 1727204177.80482: _low_level_execute_command(): starting 19665 1727204177.80485: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/AnsiballZ_ping.py && sleep 0' 19665 1727204177.81963: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204177.81984: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.82003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.82022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.82111: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.82124: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204177.82155: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.82176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204177.82188: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204177.82235: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204177.82255: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204177.82272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.82289: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.82301: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204177.82311: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204177.82325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.82494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.82512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.82527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.82672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204177.95842: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 19665 1727204177.96890: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204177.96980: stderr chunk (state=3): >>><<< 19665 1727204177.96984: stdout chunk (state=3): >>><<< 19665 1727204177.97121: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204177.97125: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204177.97133: _low_level_execute_command(): starting 19665 1727204177.97135: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204177.6462746-21951-58564694355196/ > /dev/null 2>&1 && sleep 0' 19665 1727204177.98203: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.98207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204177.98232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.98236: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204177.98238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204177.98307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204177.98980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204177.98993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204177.99067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204178.00982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204178.00986: stdout chunk (state=3): >>><<< 19665 1727204178.00992: stderr chunk (state=3): >>><<< 19665 1727204178.01010: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204178.01016: handler run complete 19665 1727204178.01032: attempt loop complete, returning result 19665 1727204178.01035: _execute() done 19665 1727204178.01038: dumping result to json 19665 1727204178.01046: done dumping result, returning 19665 1727204178.01057: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0dcc-3ea6-00000000004f] 19665 1727204178.01060: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004f 19665 1727204178.01151: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000004f 19665 1727204178.01154: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 19665 1727204178.01221: no more pending results, returning what we have 19665 1727204178.01224: results queue empty 19665 1727204178.01226: checking for any_errors_fatal 19665 1727204178.01232: done checking for any_errors_fatal 19665 1727204178.01233: checking for max_fail_percentage 19665 1727204178.01235: done checking for max_fail_percentage 19665 1727204178.01236: checking to see if all hosts have failed and the running result is not ok 19665 1727204178.01236: done checking to see if all hosts have failed 19665 1727204178.01237: getting the remaining hosts for this loop 19665 1727204178.01239: done getting the remaining hosts for this loop 19665 1727204178.01244: getting the next task for host managed-node3 19665 1727204178.01251: done getting next task for host managed-node3 19665 1727204178.01253: ^ task is: TASK: meta (role_complete) 19665 1727204178.01255: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204178.01274: getting variables 19665 1727204178.01277: in VariableManager get_vars() 19665 1727204178.01314: Calling all_inventory to load vars for managed-node3 19665 1727204178.01316: Calling groups_inventory to load vars for managed-node3 19665 1727204178.01318: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.01327: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.01330: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.01332: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.04230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.08141: done with get_vars() 19665 1727204178.08297: done getting variables 19665 1727204178.08467: done queuing things up, now waiting for results queue to drain 19665 1727204178.08470: results queue empty 19665 1727204178.08471: checking for any_errors_fatal 19665 1727204178.08475: done checking for any_errors_fatal 19665 1727204178.08475: checking for max_fail_percentage 19665 1727204178.08476: done checking for max_fail_percentage 19665 1727204178.08477: checking to see if all hosts have failed and the running result is not ok 19665 1727204178.08478: done checking to see if all hosts have failed 19665 1727204178.08479: getting the remaining hosts for this loop 19665 1727204178.08480: done getting the remaining hosts for this loop 19665 1727204178.08482: getting the next task for host managed-node3 19665 1727204178.08487: done getting next task for host managed-node3 19665 1727204178.08488: ^ task is: TASK: meta (flush_handlers) 19665 1727204178.08489: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204178.08492: getting variables 19665 1727204178.08493: in VariableManager get_vars() 19665 1727204178.08624: Calling all_inventory to load vars for managed-node3 19665 1727204178.08627: Calling groups_inventory to load vars for managed-node3 19665 1727204178.08629: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.08634: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.08637: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.08643: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.11634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.13888: done with get_vars() 19665 1727204178.13993: done getting variables 19665 1727204178.14201: in VariableManager get_vars() 19665 1727204178.14216: Calling all_inventory to load vars for managed-node3 19665 1727204178.14222: Calling groups_inventory to load vars for managed-node3 19665 1727204178.14225: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.14230: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.14233: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.14242: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.16580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.19261: done with get_vars() 19665 1727204178.19293: done queuing things up, now waiting for results queue to drain 19665 1727204178.19296: results queue empty 19665 1727204178.19296: checking for any_errors_fatal 19665 1727204178.19298: done checking for any_errors_fatal 19665 1727204178.19299: checking for max_fail_percentage 19665 1727204178.19300: done checking for max_fail_percentage 19665 1727204178.19301: checking to see if all hosts have failed and the running result is not ok 19665 1727204178.19301: done checking to see if all hosts have failed 19665 1727204178.19302: getting the remaining hosts for this loop 19665 1727204178.19303: done getting the remaining hosts for this loop 19665 1727204178.19306: getting the next task for host managed-node3 19665 1727204178.19310: done getting next task for host managed-node3 19665 1727204178.19311: ^ task is: TASK: meta (flush_handlers) 19665 1727204178.19313: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204178.19315: getting variables 19665 1727204178.19316: in VariableManager get_vars() 19665 1727204178.19447: Calling all_inventory to load vars for managed-node3 19665 1727204178.19460: Calling groups_inventory to load vars for managed-node3 19665 1727204178.19462: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.19476: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.19479: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.19482: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.21870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.24523: done with get_vars() 19665 1727204178.24551: done getting variables 19665 1727204178.24607: in VariableManager get_vars() 19665 1727204178.24621: Calling all_inventory to load vars for managed-node3 19665 1727204178.24624: Calling groups_inventory to load vars for managed-node3 19665 1727204178.24626: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.24634: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.24640: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.24644: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.26824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.30110: done with get_vars() 19665 1727204178.30139: done queuing things up, now waiting for results queue to drain 19665 1727204178.30141: results queue empty 19665 1727204178.30142: checking for any_errors_fatal 19665 1727204178.30143: done checking for any_errors_fatal 19665 1727204178.30144: checking for max_fail_percentage 19665 1727204178.30145: done checking for max_fail_percentage 19665 1727204178.30146: checking to see if all hosts have failed and the running result is not ok 19665 1727204178.30147: done checking to see if all hosts have failed 19665 1727204178.30147: getting the remaining hosts for this loop 19665 1727204178.30148: done getting the remaining hosts for this loop 19665 1727204178.30151: getting the next task for host managed-node3 19665 1727204178.30155: done getting next task for host managed-node3 19665 1727204178.30156: ^ task is: None 19665 1727204178.30157: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204178.30158: done queuing things up, now waiting for results queue to drain 19665 1727204178.30159: results queue empty 19665 1727204178.30160: checking for any_errors_fatal 19665 1727204178.30161: done checking for any_errors_fatal 19665 1727204178.30161: checking for max_fail_percentage 19665 1727204178.30162: done checking for max_fail_percentage 19665 1727204178.30163: checking to see if all hosts have failed and the running result is not ok 19665 1727204178.30165: done checking to see if all hosts have failed 19665 1727204178.30166: getting the next task for host managed-node3 19665 1727204178.30169: done getting next task for host managed-node3 19665 1727204178.30171: ^ task is: None 19665 1727204178.30172: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204178.30216: in VariableManager get_vars() 19665 1727204178.30233: done with get_vars() 19665 1727204178.30238: in VariableManager get_vars() 19665 1727204178.30247: done with get_vars() 19665 1727204178.30251: variable 'omit' from source: magic vars 19665 1727204178.30284: in VariableManager get_vars() 19665 1727204178.30295: done with get_vars() 19665 1727204178.30318: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 19665 1727204178.30556: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204178.30582: getting the remaining hosts for this loop 19665 1727204178.30583: done getting the remaining hosts for this loop 19665 1727204178.30586: getting the next task for host managed-node3 19665 1727204178.30588: done getting next task for host managed-node3 19665 1727204178.30590: ^ task is: TASK: Gathering Facts 19665 1727204178.30592: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204178.30594: getting variables 19665 1727204178.30595: in VariableManager get_vars() 19665 1727204178.30603: Calling all_inventory to load vars for managed-node3 19665 1727204178.30605: Calling groups_inventory to load vars for managed-node3 19665 1727204178.30608: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204178.30613: Calling all_plugins_play to load vars for managed-node3 19665 1727204178.30615: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204178.30618: Calling groups_plugins_play to load vars for managed-node3 19665 1727204178.31898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204178.33653: done with get_vars() 19665 1727204178.33677: done getting variables 19665 1727204178.33719: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:56:18 -0400 (0:00:00.814) 0:00:29.204 ***** 19665 1727204178.33750: entering _queue_task() for managed-node3/gather_facts 19665 1727204178.34619: worker is 1 (out of 1 available) 19665 1727204178.34631: exiting _queue_task() for managed-node3/gather_facts 19665 1727204178.34641: done queuing things up, now waiting for results queue to drain 19665 1727204178.34643: waiting for pending results... 
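The banners above mark the point where the run leaves the fedora.linux_system_roles.network role and enters the next play of the test playbook down_profile+delete_interface.yml, which starts with implicit fact gathering. The playbook source itself is not reproduced in this log; the snippet below is only a minimal, hypothetical sketch of a play that would emit the same "PLAY [Delete the interface]" and "TASK [Gathering Facts]" banners. The host pattern and the placeholder task are assumptions for illustration, not the actual test content.

    - name: Delete the interface
      hosts: managed-node3          # assumed; the real play's host pattern is not shown in this log
      gather_facts: true            # implicit fact gathering produces the "Gathering Facts" task above
      tasks:
        - name: Placeholder for the interface-deletion steps (hypothetical; real tasks not shown here)
          ansible.builtin.debug:
            msg: "interface teardown tasks would follow here"

In the recorded run, that fact-gathering step is carried out by transferring and executing the setup module over the existing SSH connection, which is what the AnsiballZ_setup.py upload and execution in the following log statements show.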
19665 1727204178.34981: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204178.35095: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000382 19665 1727204178.35116: variable 'ansible_search_path' from source: unknown 19665 1727204178.35155: calling self._execute() 19665 1727204178.35252: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204178.35265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204178.35278: variable 'omit' from source: magic vars 19665 1727204178.35666: variable 'ansible_distribution_major_version' from source: facts 19665 1727204178.35688: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204178.35701: variable 'omit' from source: magic vars 19665 1727204178.35735: variable 'omit' from source: magic vars 19665 1727204178.35779: variable 'omit' from source: magic vars 19665 1727204178.35821: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204178.35860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204178.35890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204178.35912: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204178.35927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204178.35959: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204178.35971: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204178.35979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204178.36085: Set connection var ansible_connection to ssh 19665 1727204178.36097: Set connection var ansible_shell_type to sh 19665 1727204178.36107: Set connection var ansible_timeout to 10 19665 1727204178.36116: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204178.36127: Set connection var ansible_shell_executable to /bin/sh 19665 1727204178.36138: Set connection var ansible_pipelining to False 19665 1727204178.36166: variable 'ansible_shell_executable' from source: unknown 19665 1727204178.36175: variable 'ansible_connection' from source: unknown 19665 1727204178.36181: variable 'ansible_module_compression' from source: unknown 19665 1727204178.36191: variable 'ansible_shell_type' from source: unknown 19665 1727204178.36198: variable 'ansible_shell_executable' from source: unknown 19665 1727204178.36204: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204178.36210: variable 'ansible_pipelining' from source: unknown 19665 1727204178.36216: variable 'ansible_timeout' from source: unknown 19665 1727204178.36222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204178.36401: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204178.36421: variable 'omit' from source: magic vars 19665 1727204178.36430: starting attempt loop 19665 1727204178.36437: running the 
handler 19665 1727204178.36455: variable 'ansible_facts' from source: unknown 19665 1727204178.36480: _low_level_execute_command(): starting 19665 1727204178.36491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204178.39362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204178.39386: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.39504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.39597: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.40319: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.40332: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204178.40350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.40373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204178.40386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204178.40399: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204178.40419: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.40435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.40475: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.40602: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.40625: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204178.40688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.40789: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204178.40853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204178.40877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204178.40955: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204178.42592: stdout chunk (state=3): >>>/root <<< 19665 1727204178.42798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204178.42802: stdout chunk (state=3): >>><<< 19665 1727204178.42804: stderr chunk (state=3): >>><<< 19665 1727204178.42873: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204178.42879: _low_level_execute_command(): starting 19665 1727204178.42882: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682 `" && echo ansible-tmp-1727204178.4282534-21975-35990216628682="` echo /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682 `" ) && sleep 0' 19665 1727204178.44429: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204178.44605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.44646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.44781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.44922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.44936: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204178.44957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.44978: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204178.45002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204178.45038: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204178.45058: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.45091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.45192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.45277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.45308: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204178.45322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.45413: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204178.45520: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204178.45568: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204178.45677: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204178.47589: stdout chunk (state=3): >>>ansible-tmp-1727204178.4282534-21975-35990216628682=/root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682 <<< 19665 1727204178.47706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204178.47804: stderr chunk (state=3): >>><<< 19665 1727204178.47827: stdout chunk (state=3): >>><<< 19665 1727204178.48188: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204178.4282534-21975-35990216628682=/root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204178.48192: variable 'ansible_module_compression' from source: unknown 19665 1727204178.48195: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204178.48197: variable 'ansible_facts' from source: unknown 19665 1727204178.48262: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/AnsiballZ_setup.py 19665 1727204178.48732: Sending initial data 19665 1727204178.48735: Sent initial data (153 bytes) 19665 1727204178.51812: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.52123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.52320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.52335: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204178.52349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.52352: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204178.52431: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204178.52434: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204178.52437: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.52442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.52445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.52483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.52487: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204178.52520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.52528: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 19665 1727204178.52687: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204178.52712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204178.52952: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204178.54763: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 19665 1727204178.54769: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204178.54795: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204178.54840: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5rhi6yjx /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/AnsiballZ_setup.py <<< 19665 1727204178.54899: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204178.58512: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204178.58650: stderr chunk (state=3): >>><<< 19665 1727204178.58654: stdout chunk (state=3): >>><<< 19665 1727204178.58656: done transferring module to remote 19665 1727204178.58659: _low_level_execute_command(): starting 19665 1727204178.58672: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/ /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/AnsiballZ_setup.py && sleep 0' 19665 1727204178.60562: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204178.60642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.60705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.60810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.60884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.60936: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204178.61001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.61130: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204178.61181: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204178.61209: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204178.61235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.61300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.61371: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.61437: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.61454: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204178.61469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.61558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204178.61618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204178.61636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204178.61719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204178.63586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204178.63625: stderr chunk (state=3): >>><<< 19665 1727204178.63628: stdout chunk (state=3): >>><<< 19665 1727204178.63728: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204178.63731: _low_level_execute_command(): starting 19665 1727204178.63734: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/AnsiballZ_setup.py && sleep 0' 19665 1727204178.65090: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204178.65220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.65235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.65249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.65291: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.65298: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204178.65308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.65326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204178.65434: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204178.65444: stderr chunk (state=3): >>>debug1: re-parsing 
configuration <<< 19665 1727204178.65456: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204178.65466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204178.65482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204178.65492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204178.65499: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204178.65509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204178.65591: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204178.65610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204178.65656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204178.65766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.15659: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.896410Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618896410", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": <<< 19665 1727204179.15697: stdout chunk (state=3): >>>"-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, 
"ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 524, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.26, "5m": 0.32, "15m": 0.16}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": 
"on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", 
"tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204179.17408: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204179.17412: stdout chunk (state=3): >>><<< 19665 1727204179.17415: stderr chunk (state=3): >>><<< 19665 1727204179.17674: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "18", "epoch": "1727204178", "epoch_int": "1727204178", "date": "2024-09-24", "time": "14:56:18", "iso8601_micro": "2024-09-24T18:56:18.896410Z", "iso8601": "2024-09-24T18:56:18Z", "iso8601_basic": "20240924T145618896410", "iso8601_basic_short": "20240924T145618", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_is_chroot": false, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2810, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 722, "free": 2810}, "nocache": {"free": 3269, "used": 263}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", 
"ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 524, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.26, "5m": 0.32, "15m": 0.16}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", 
"tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", 
"broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204179.17854: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204179.17884: _low_level_execute_command(): starting 19665 1727204179.17899: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204178.4282534-21975-35990216628682/ > /dev/null 2>&1 && sleep 0' 19665 1727204179.19670: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.19685: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.19699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.19757: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.19801: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.19857: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.19874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.19890: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.19901: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.19911: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.19921: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.19934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.19956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.19968: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.19977: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.19988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.20073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.20202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.20217: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.20297: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.22187: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204179.22191: stdout chunk (state=3): >>><<< 19665 1727204179.22195: stderr chunk (state=3): >>><<< 19665 1727204179.22376: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204179.22379: handler run complete 19665 1727204179.22382: variable 'ansible_facts' from source: unknown 19665 1727204179.22473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.22832: variable 'ansible_facts' from source: unknown 19665 1727204179.23046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.23333: attempt loop complete, returning result 19665 1727204179.23366: _execute() done 19665 1727204179.23375: dumping result to json 19665 1727204179.23424: done dumping result, returning 19665 1727204179.23477: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-000000000382] 19665 1727204179.23518: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000382 ok: [managed-node3] 19665 1727204179.24409: no more pending results, returning what we have 19665 1727204179.24412: results queue empty 19665 1727204179.24414: checking for any_errors_fatal 19665 1727204179.24415: done checking for any_errors_fatal 19665 1727204179.24416: checking for max_fail_percentage 19665 1727204179.24419: done checking for max_fail_percentage 19665 1727204179.24420: checking to see if all hosts have failed and the running result is not ok 19665 1727204179.24420: done checking to see if all hosts have failed 19665 1727204179.24421: getting the remaining hosts for this loop 19665 1727204179.24423: done getting the remaining hosts for this loop 19665 1727204179.24427: getting the next task for host managed-node3 19665 1727204179.24434: done getting next task for host managed-node3 19665 1727204179.24436: ^ task is: TASK: meta (flush_handlers) 19665 1727204179.24440: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204179.24445: getting variables 19665 1727204179.24446: in VariableManager get_vars() 19665 1727204179.24472: Calling all_inventory to load vars for managed-node3 19665 1727204179.24481: Calling groups_inventory to load vars for managed-node3 19665 1727204179.24485: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204179.24496: Calling all_plugins_play to load vars for managed-node3 19665 1727204179.24500: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204179.24503: Calling groups_plugins_play to load vars for managed-node3 19665 1727204179.25471: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000382 19665 1727204179.25475: WORKER PROCESS EXITING 19665 1727204179.37393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.39843: done with get_vars() 19665 1727204179.40079: done getting variables 19665 1727204179.40142: in VariableManager get_vars() 19665 1727204179.40153: Calling all_inventory to load vars for managed-node3 19665 1727204179.40156: Calling groups_inventory to load vars for managed-node3 19665 1727204179.40159: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204179.40166: Calling all_plugins_play to load vars for managed-node3 19665 1727204179.40168: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204179.40180: Calling groups_plugins_play to load vars for managed-node3 19665 1727204179.43342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.45327: done with get_vars() 19665 1727204179.45361: done queuing things up, now waiting for results queue to drain 19665 1727204179.45368: results queue empty 19665 1727204179.45369: checking for any_errors_fatal 19665 1727204179.45374: done checking for any_errors_fatal 19665 1727204179.45375: checking for max_fail_percentage 19665 1727204179.45376: done checking for max_fail_percentage 19665 1727204179.45377: checking to see if all hosts have failed and the running result is not ok 19665 1727204179.45378: done checking to see if all hosts have failed 19665 1727204179.45379: getting the remaining hosts for this loop 19665 1727204179.45380: done getting the remaining hosts for this loop 19665 1727204179.45383: getting the next task for host managed-node3 19665 1727204179.45387: done getting next task for host managed-node3 19665 1727204179.45389: ^ task is: TASK: Include the task 'delete_interface.yml' 19665 1727204179.45391: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204179.45393: getting variables 19665 1727204179.45394: in VariableManager get_vars() 19665 1727204179.45405: Calling all_inventory to load vars for managed-node3 19665 1727204179.45407: Calling groups_inventory to load vars for managed-node3 19665 1727204179.45410: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204179.45415: Calling all_plugins_play to load vars for managed-node3 19665 1727204179.45417: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204179.45420: Calling groups_plugins_play to load vars for managed-node3 19665 1727204179.46936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.49913: done with get_vars() 19665 1727204179.49949: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:56:19 -0400 (0:00:01.162) 0:00:30.366 ***** 19665 1727204179.50035: entering _queue_task() for managed-node3/include_tasks 19665 1727204179.50472: worker is 1 (out of 1 available) 19665 1727204179.50485: exiting _queue_task() for managed-node3/include_tasks 19665 1727204179.50497: done queuing things up, now waiting for results queue to drain 19665 1727204179.50499: waiting for pending results... 19665 1727204179.50789: running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' 19665 1727204179.50924: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000052 19665 1727204179.50951: variable 'ansible_search_path' from source: unknown 19665 1727204179.50995: calling self._execute() 19665 1727204179.51103: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204179.51115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204179.51133: variable 'omit' from source: magic vars 19665 1727204179.51548: variable 'ansible_distribution_major_version' from source: facts 19665 1727204179.51575: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204179.51592: _execute() done 19665 1727204179.51600: dumping result to json 19665 1727204179.51608: done dumping result, returning 19665 1727204179.51621: done running TaskExecutor() for managed-node3/TASK: Include the task 'delete_interface.yml' [0affcd87-79f5-0dcc-3ea6-000000000052] 19665 1727204179.51633: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000052 19665 1727204179.51780: no more pending results, returning what we have 19665 1727204179.51785: in VariableManager get_vars() 19665 1727204179.51837: Calling all_inventory to load vars for managed-node3 19665 1727204179.51843: Calling groups_inventory to load vars for managed-node3 19665 1727204179.51847: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204179.51861: Calling all_plugins_play to load vars for managed-node3 19665 1727204179.51867: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204179.51870: Calling groups_plugins_play to load vars for managed-node3 19665 1727204179.53014: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000052 19665 1727204179.53018: WORKER PROCESS EXITING 19665 1727204179.54055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.56798: done with get_vars() 19665 
1727204179.56823: variable 'ansible_search_path' from source: unknown 19665 1727204179.56840: we have included files to process 19665 1727204179.56843: generating all_blocks data 19665 1727204179.56845: done generating all_blocks data 19665 1727204179.56846: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19665 1727204179.56847: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19665 1727204179.56852: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 19665 1727204179.57480: done processing included file 19665 1727204179.57482: iterating over new_blocks loaded from include file 19665 1727204179.57483: in VariableManager get_vars() 19665 1727204179.57499: done with get_vars() 19665 1727204179.57501: filtering new block on tags 19665 1727204179.57520: done filtering new block on tags 19665 1727204179.57523: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node3 19665 1727204179.57534: extending task lists for all hosts with included blocks 19665 1727204179.57579: done extending task lists 19665 1727204179.57580: done processing included files 19665 1727204179.57581: results queue empty 19665 1727204179.57582: checking for any_errors_fatal 19665 1727204179.57584: done checking for any_errors_fatal 19665 1727204179.57585: checking for max_fail_percentage 19665 1727204179.57586: done checking for max_fail_percentage 19665 1727204179.57587: checking to see if all hosts have failed and the running result is not ok 19665 1727204179.57588: done checking to see if all hosts have failed 19665 1727204179.57589: getting the remaining hosts for this loop 19665 1727204179.57590: done getting the remaining hosts for this loop 19665 1727204179.57593: getting the next task for host managed-node3 19665 1727204179.57597: done getting next task for host managed-node3 19665 1727204179.57600: ^ task is: TASK: Remove test interface if necessary 19665 1727204179.57603: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204179.57605: getting variables 19665 1727204179.57606: in VariableManager get_vars() 19665 1727204179.57616: Calling all_inventory to load vars for managed-node3 19665 1727204179.57619: Calling groups_inventory to load vars for managed-node3 19665 1727204179.57622: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204179.57628: Calling all_plugins_play to load vars for managed-node3 19665 1727204179.57630: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204179.57633: Calling groups_plugins_play to load vars for managed-node3 19665 1727204179.59104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204179.60929: done with get_vars() 19665 1727204179.60954: done getting variables 19665 1727204179.60996: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:56:19 -0400 (0:00:00.109) 0:00:30.476 ***** 19665 1727204179.61032: entering _queue_task() for managed-node3/command 19665 1727204179.61404: worker is 1 (out of 1 available) 19665 1727204179.61416: exiting _queue_task() for managed-node3/command 19665 1727204179.61428: done queuing things up, now waiting for results queue to drain 19665 1727204179.61430: waiting for pending results... 19665 1727204179.62116: running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary 19665 1727204179.62774: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000393 19665 1727204179.62828: variable 'ansible_search_path' from source: unknown 19665 1727204179.62965: variable 'ansible_search_path' from source: unknown 19665 1727204179.63155: calling self._execute() 19665 1727204179.63273: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204179.63286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204179.63308: variable 'omit' from source: magic vars 19665 1727204179.64874: variable 'ansible_distribution_major_version' from source: facts 19665 1727204179.64898: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204179.64916: variable 'omit' from source: magic vars 19665 1727204179.65026: variable 'omit' from source: magic vars 19665 1727204179.65455: variable 'interface' from source: set_fact 19665 1727204179.65486: variable 'omit' from source: magic vars 19665 1727204179.65537: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204179.65730: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204179.65790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204179.65894: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204179.65914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 
1727204179.65988: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204179.65999: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204179.66008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204179.66156: Set connection var ansible_connection to ssh 19665 1727204179.66173: Set connection var ansible_shell_type to sh 19665 1727204179.66218: Set connection var ansible_timeout to 10 19665 1727204179.66231: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204179.66250: Set connection var ansible_shell_executable to /bin/sh 19665 1727204179.66266: Set connection var ansible_pipelining to False 19665 1727204179.66297: variable 'ansible_shell_executable' from source: unknown 19665 1727204179.66311: variable 'ansible_connection' from source: unknown 19665 1727204179.66323: variable 'ansible_module_compression' from source: unknown 19665 1727204179.66332: variable 'ansible_shell_type' from source: unknown 19665 1727204179.66342: variable 'ansible_shell_executable' from source: unknown 19665 1727204179.66352: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204179.66366: variable 'ansible_pipelining' from source: unknown 19665 1727204179.66374: variable 'ansible_timeout' from source: unknown 19665 1727204179.66382: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204179.66549: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204179.66571: variable 'omit' from source: magic vars 19665 1727204179.66582: starting attempt loop 19665 1727204179.66590: running the handler 19665 1727204179.66608: _low_level_execute_command(): starting 19665 1727204179.66620: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204179.67485: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.67502: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.67524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.67550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.67598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.67610: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.67629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.67657: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.67673: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.67685: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.67701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.67784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.67826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 
1727204179.67842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.67878: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.67911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.68162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.68187: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.68206: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.68282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.69901: stdout chunk (state=3): >>>/root <<< 19665 1727204179.70009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204179.70153: stderr chunk (state=3): >>><<< 19665 1727204179.70156: stdout chunk (state=3): >>><<< 19665 1727204179.70204: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204179.70208: _low_level_execute_command(): starting 19665 1727204179.70211: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606 `" && echo ansible-tmp-1727204179.70186-22020-71879811244606="` echo /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606 `" ) && sleep 0' 19665 1727204179.70901: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.70909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.70919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.70933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.70974: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.70981: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.70992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.71004: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 
19665 1727204179.71013: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.71020: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.71026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.71037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.71050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.71058: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.71088: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.71091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.71155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.71166: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.71179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.71259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.73066: stdout chunk (state=3): >>>ansible-tmp-1727204179.70186-22020-71879811244606=/root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606 <<< 19665 1727204179.73262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204179.73269: stdout chunk (state=3): >>><<< 19665 1727204179.73274: stderr chunk (state=3): >>><<< 19665 1727204179.73293: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204179.70186-22020-71879811244606=/root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204179.73337: variable 'ansible_module_compression' from source: unknown 19665 1727204179.73391: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19665 1727204179.73431: variable 'ansible_facts' from source: unknown 19665 1727204179.73511: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/AnsiballZ_command.py 19665 1727204179.73646: Sending initial data 19665 1727204179.73650: Sent initial data 
(153 bytes) 19665 1727204179.74702: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.74706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.74712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.74753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.74769: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.74781: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.74786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.74797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.74805: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.74812: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.74820: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.74829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.74850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.74866: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.74869: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.74871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.75008: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.75012: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.75016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.75156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.76823: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204179.76858: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204179.76900: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp26cfrwy6 /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/AnsiballZ_command.py <<< 19665 1727204179.76937: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204179.78274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204179.78360: stderr chunk (state=3): >>><<< 19665 1727204179.78363: stdout chunk (state=3): >>><<< 19665 
1727204179.78389: done transferring module to remote 19665 1727204179.78399: _low_level_execute_command(): starting 19665 1727204179.78407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/ /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/AnsiballZ_command.py && sleep 0' 19665 1727204179.79052: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.79060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.79074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.79104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.79130: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.79135: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.79150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.79158: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.79167: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.79173: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.79181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.79189: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.79241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.79244: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.79246: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.79248: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.79303: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.79315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.79326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.79411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.81198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204179.81204: stdout chunk (state=3): >>><<< 19665 1727204179.81206: stderr chunk (state=3): >>><<< 19665 1727204179.81246: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204179.81250: _low_level_execute_command(): starting 19665 1727204179.81254: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/AnsiballZ_command.py && sleep 0' 19665 1727204179.81872: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.81880: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.81914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.81917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.81945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.81952: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.81962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.81979: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.81986: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.81992: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.82000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.82009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.82020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.82028: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.82034: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.82044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.82115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.82129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.82141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.82214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204179.96124: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:19.952640", "end": "2024-09-24 14:56:19.960137", "delta": "0:00:00.007497", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204179.97337: stderr chunk 
(state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. <<< 19665 1727204179.97345: stderr chunk (state=3): >>><<< 19665 1727204179.97348: stdout chunk (state=3): >>><<< 19665 1727204179.97379: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"LSR-TST-br31\"", "rc": 1, "cmd": ["ip", "link", "del", "LSR-TST-br31"], "start": "2024-09-24 14:56:19.952640", "end": "2024-09-24 14:56:19.960137", "delta": "0:00:00.007497", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del LSR-TST-br31", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
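The result JSON above shows ansible.legacy.command executing `ip link del LSR-TST-br31` on managed-node3 and getting rc=1 ("Cannot find device"), which this cleanup step tolerates (the task result is reported with "...ignoring" further below). A minimal sketch of what the task at tests/network/playbooks/tasks/delete_interface.yml:3 presumably looks like, reconstructed only from this log; the exact contents of that file and its use of `ignore_errors` are assumptions, not something the log confirms:

```yaml
# Hypothetical reconstruction of the "Remove test interface if necessary" task;
# the real delete_interface.yml may differ.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}   # interface comes from an earlier set_fact (LSR-TST-br31 in this run)
  ignore_errors: true                    # rc=1 / "Cannot find device" is tolerated: the device is already gone
```

Tolerating the missing device keeps the cleanup idempotent: whether or not LSR-TST-br31 exists beforehand, the task ends with the host not having it.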
19665 1727204179.97422: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204179.97429: _low_level_execute_command(): starting 19665 1727204179.97435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204179.70186-22020-71879811244606/ > /dev/null 2>&1 && sleep 0' 19665 1727204179.98893: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204179.98911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.98921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.98935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.98973: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.98980: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204179.98990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.99005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204179.99021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204179.99027: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204179.99035: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204179.99044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204179.99055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204179.99062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204179.99070: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204179.99079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204179.99150: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204179.99167: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204179.99170: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204179.99243: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204180.01088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204180.01142: stderr chunk (state=3): >>><<< 19665 1727204180.01146: stdout chunk (state=3): >>><<< 19665 1727204180.01168: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 
3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204180.01176: handler run complete 19665 1727204180.01202: Evaluated conditional (False): False 19665 1727204180.01213: attempt loop complete, returning result 19665 1727204180.01216: _execute() done 19665 1727204180.01219: dumping result to json 19665 1727204180.01225: done dumping result, returning 19665 1727204180.01234: done running TaskExecutor() for managed-node3/TASK: Remove test interface if necessary [0affcd87-79f5-0dcc-3ea6-000000000393] 19665 1727204180.01239: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000393 19665 1727204180.01342: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000393 19665 1727204180.01344: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! => { "changed": false, "cmd": [ "ip", "link", "del", "LSR-TST-br31" ], "delta": "0:00:00.007497", "end": "2024-09-24 14:56:19.960137", "rc": 1, "start": "2024-09-24 14:56:19.952640" } STDERR: Cannot find device "LSR-TST-br31" MSG: non-zero return code ...ignoring 19665 1727204180.01412: no more pending results, returning what we have 19665 1727204180.01416: results queue empty 19665 1727204180.01417: checking for any_errors_fatal 19665 1727204180.01418: done checking for any_errors_fatal 19665 1727204180.01419: checking for max_fail_percentage 19665 1727204180.01421: done checking for max_fail_percentage 19665 1727204180.01422: checking to see if all hosts have failed and the running result is not ok 19665 1727204180.01423: done checking to see if all hosts have failed 19665 1727204180.01423: getting the remaining hosts for this loop 19665 1727204180.01425: done getting the remaining hosts for this loop 19665 1727204180.01429: getting the next task for host managed-node3 19665 1727204180.01437: done getting next task for host managed-node3 19665 1727204180.01439: ^ task is: TASK: meta (flush_handlers) 19665 1727204180.01441: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204180.01445: getting variables 19665 1727204180.01446: in VariableManager get_vars() 19665 1727204180.01478: Calling all_inventory to load vars for managed-node3 19665 1727204180.01481: Calling groups_inventory to load vars for managed-node3 19665 1727204180.01485: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204180.01497: Calling all_plugins_play to load vars for managed-node3 19665 1727204180.01500: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204180.01502: Calling groups_plugins_play to load vars for managed-node3 19665 1727204180.05189: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204180.09952: done with get_vars() 19665 1727204180.09987: done getting variables 19665 1727204180.10871: in VariableManager get_vars() 19665 1727204180.10885: Calling all_inventory to load vars for managed-node3 19665 1727204180.10888: Calling groups_inventory to load vars for managed-node3 19665 1727204180.10891: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204180.10897: Calling all_plugins_play to load vars for managed-node3 19665 1727204180.10900: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204180.10911: Calling groups_plugins_play to load vars for managed-node3 19665 1727204180.13794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204180.17487: done with get_vars() 19665 1727204180.17524: done queuing things up, now waiting for results queue to drain 19665 1727204180.17527: results queue empty 19665 1727204180.17527: checking for any_errors_fatal 19665 1727204180.17531: done checking for any_errors_fatal 19665 1727204180.17532: checking for max_fail_percentage 19665 1727204180.17533: done checking for max_fail_percentage 19665 1727204180.17534: checking to see if all hosts have failed and the running result is not ok 19665 1727204180.17535: done checking to see if all hosts have failed 19665 1727204180.17535: getting the remaining hosts for this loop 19665 1727204180.17536: done getting the remaining hosts for this loop 19665 1727204180.17542: getting the next task for host managed-node3 19665 1727204180.17545: done getting next task for host managed-node3 19665 1727204180.17547: ^ task is: TASK: meta (flush_handlers) 19665 1727204180.17548: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204180.17551: getting variables 19665 1727204180.17552: in VariableManager get_vars() 19665 1727204180.17562: Calling all_inventory to load vars for managed-node3 19665 1727204180.17566: Calling groups_inventory to load vars for managed-node3 19665 1727204180.17569: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204180.17574: Calling all_plugins_play to load vars for managed-node3 19665 1727204180.17576: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204180.17579: Calling groups_plugins_play to load vars for managed-node3 19665 1727204180.19914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204180.23035: done with get_vars() 19665 1727204180.23073: done getting variables 19665 1727204180.23127: in VariableManager get_vars() 19665 1727204180.23141: Calling all_inventory to load vars for managed-node3 19665 1727204180.23144: Calling groups_inventory to load vars for managed-node3 19665 1727204180.23146: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204180.23151: Calling all_plugins_play to load vars for managed-node3 19665 1727204180.23154: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204180.23157: Calling groups_plugins_play to load vars for managed-node3 19665 1727204180.25966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204180.29890: done with get_vars() 19665 1727204180.29954: done queuing things up, now waiting for results queue to drain 19665 1727204180.29957: results queue empty 19665 1727204180.29958: checking for any_errors_fatal 19665 1727204180.29959: done checking for any_errors_fatal 19665 1727204180.29960: checking for max_fail_percentage 19665 1727204180.29961: done checking for max_fail_percentage 19665 1727204180.29962: checking to see if all hosts have failed and the running result is not ok 19665 1727204180.29963: done checking to see if all hosts have failed 19665 1727204180.29966: getting the remaining hosts for this loop 19665 1727204180.29968: done getting the remaining hosts for this loop 19665 1727204180.29971: getting the next task for host managed-node3 19665 1727204180.29974: done getting next task for host managed-node3 19665 1727204180.29975: ^ task is: None 19665 1727204180.29979: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204180.29980: done queuing things up, now waiting for results queue to drain 19665 1727204180.29981: results queue empty 19665 1727204180.29981: checking for any_errors_fatal 19665 1727204180.29982: done checking for any_errors_fatal 19665 1727204180.29983: checking for max_fail_percentage 19665 1727204180.30040: done checking for max_fail_percentage 19665 1727204180.30042: checking to see if all hosts have failed and the running result is not ok 19665 1727204180.30042: done checking to see if all hosts have failed 19665 1727204180.30044: getting the next task for host managed-node3 19665 1727204180.30047: done getting next task for host managed-node3 19665 1727204180.30048: ^ task is: None 19665 1727204180.30049: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204180.30992: in VariableManager get_vars() 19665 1727204180.31016: done with get_vars() 19665 1727204180.31023: in VariableManager get_vars() 19665 1727204180.31036: done with get_vars() 19665 1727204180.31040: variable 'omit' from source: magic vars 19665 1727204180.31170: variable 'profile' from source: play vars 19665 1727204180.31288: in VariableManager get_vars() 19665 1727204180.31303: done with get_vars() 19665 1727204180.31329: variable 'omit' from source: magic vars 19665 1727204180.31406: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 19665 1727204180.32707: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204180.32914: getting the remaining hosts for this loop 19665 1727204180.32916: done getting the remaining hosts for this loop 19665 1727204180.32919: getting the next task for host managed-node3 19665 1727204180.32921: done getting next task for host managed-node3 19665 1727204180.32923: ^ task is: TASK: Gathering Facts 19665 1727204180.32925: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204180.32927: getting variables 19665 1727204180.32927: in VariableManager get_vars() 19665 1727204180.32942: Calling all_inventory to load vars for managed-node3 19665 1727204180.32944: Calling groups_inventory to load vars for managed-node3 19665 1727204180.32948: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204180.32954: Calling all_plugins_play to load vars for managed-node3 19665 1727204180.32956: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204180.32959: Calling groups_plugins_play to load vars for managed-node3 19665 1727204180.35898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204180.38062: done with get_vars() 19665 1727204180.38101: done getting variables 19665 1727204180.38153: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:56:20 -0400 (0:00:00.771) 0:00:31.248 ***** 19665 1727204180.38184: entering _queue_task() for managed-node3/gather_facts 19665 1727204180.38878: worker is 1 (out of 1 available) 19665 1727204180.38888: exiting _queue_task() for managed-node3/gather_facts 19665 1727204180.38899: done queuing things up, now waiting for results queue to drain 19665 1727204180.38901: waiting for pending results... 
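At this point the previous play has drained and the run has entered PLAY [Remove {{ profile }}], queuing its implicit fact-gathering task from playbooks/remove_profile.yml:3. A minimal sketch of a play header that would produce this banner and the Gathering Facts step; apart from the play name, every detail here (hosts pattern, the removal tasks themselves) is an assumption:

```yaml
# Hypothetical sketch of the play opening in remove_profile.yml; only the play
# name and the implicit fact gathering are visible in the log above.
- name: Remove {{ profile }}
  hosts: all               # this run targets managed-node3; the real host pattern is not shown
  gather_facts: true       # produces the TASK [Gathering Facts] that follows
  tasks: []                # the actual profile-removal tasks appear later in the log
```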
19665 1727204180.39927: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204180.40085: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003a1 19665 1727204180.40105: variable 'ansible_search_path' from source: unknown 19665 1727204180.40148: calling self._execute() 19665 1727204180.40267: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204180.40284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204180.40308: variable 'omit' from source: magic vars 19665 1727204180.40754: variable 'ansible_distribution_major_version' from source: facts 19665 1727204180.40777: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204180.40788: variable 'omit' from source: magic vars 19665 1727204180.40828: variable 'omit' from source: magic vars 19665 1727204180.40880: variable 'omit' from source: magic vars 19665 1727204180.40928: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204180.40982: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204180.41010: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204180.41033: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204180.41067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204180.41101: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204180.41110: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204180.41117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204180.41232: Set connection var ansible_connection to ssh 19665 1727204180.41248: Set connection var ansible_shell_type to sh 19665 1727204180.41265: Set connection var ansible_timeout to 10 19665 1727204180.41285: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204180.41298: Set connection var ansible_shell_executable to /bin/sh 19665 1727204180.41309: Set connection var ansible_pipelining to False 19665 1727204180.41335: variable 'ansible_shell_executable' from source: unknown 19665 1727204180.41346: variable 'ansible_connection' from source: unknown 19665 1727204180.41354: variable 'ansible_module_compression' from source: unknown 19665 1727204180.41360: variable 'ansible_shell_type' from source: unknown 19665 1727204180.41372: variable 'ansible_shell_executable' from source: unknown 19665 1727204180.41386: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204180.41393: variable 'ansible_pipelining' from source: unknown 19665 1727204180.41399: variable 'ansible_timeout' from source: unknown 19665 1727204180.41406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204180.41606: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204180.41623: variable 'omit' from source: magic vars 19665 1727204180.41631: starting attempt loop 19665 1727204180.41637: running the 
handler 19665 1727204180.41659: variable 'ansible_facts' from source: unknown 19665 1727204180.41686: _low_level_execute_command(): starting 19665 1727204180.41706: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204180.44200: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204180.44270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.44377: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.44498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.44646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204180.44668: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204180.44683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.44706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204180.44727: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204180.44765: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204180.44834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.44919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.44954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.44979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204180.45104: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204180.45168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.45342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204180.45422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204180.45472: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204180.45645: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204180.47192: stdout chunk (state=3): >>>/root <<< 19665 1727204180.47296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204180.47359: stderr chunk (state=3): >>><<< 19665 1727204180.47362: stdout chunk (state=3): >>><<< 19665 1727204180.47429: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204180.47433: _low_level_execute_command(): starting 19665 1727204180.47436: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259 `" && echo ansible-tmp-1727204180.4738038-22057-113236210484259="` echo /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259 `" ) && sleep 0' 19665 1727204180.47862: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.47866: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.47917: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.47921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.47923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.47979: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204180.47983: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204180.48035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204180.49903: stdout chunk (state=3): >>>ansible-tmp-1727204180.4738038-22057-113236210484259=/root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259 <<< 19665 1727204180.50009: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204180.50069: stderr chunk (state=3): >>><<< 19665 1727204180.50095: stdout chunk (state=3): >>><<< 19665 1727204180.50123: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204180.4738038-22057-113236210484259=/root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204180.50145: variable 'ansible_module_compression' from source: unknown 19665 1727204180.50223: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204180.50320: variable 'ansible_facts' from source: unknown 19665 1727204180.50546: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/AnsiballZ_setup.py 19665 1727204180.50947: Sending initial data 19665 1727204180.50950: Sent initial data (154 bytes) 19665 1727204180.52130: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204180.52146: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.52165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.52182: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.52213: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204180.52221: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204180.52230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.52247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204180.52256: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204180.52271: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204180.52290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.52292: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.52345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204180.52348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204180.52411: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204180.54164: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 19665 1727204180.54182: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 19665 1727204180.54197: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 19665 1727204180.54209: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 19665 1727204180.54224: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 
<<< 19665 1727204180.54235: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 19665 1727204180.54249: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 19665 1727204180.54260: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 19665 1727204180.54273: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204180.54343: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 19665 1727204180.54375: stderr chunk (state=3): >>>debug1: Using server upload size 261120 <<< 19665 1727204180.54391: stderr chunk (state=3): >>>debug1: Server handle limit 1019; using 64 <<< 19665 1727204180.54449: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp5heggi7q /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/AnsiballZ_setup.py <<< 19665 1727204180.54503: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204180.56902: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204180.57118: stderr chunk (state=3): >>><<< 19665 1727204180.57122: stdout chunk (state=3): >>><<< 19665 1727204180.57125: done transferring module to remote 19665 1727204180.57127: _low_level_execute_command(): starting 19665 1727204180.57130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/ /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/AnsiballZ_setup.py && sleep 0' 19665 1727204180.57928: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204180.57951: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.57977: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.58009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.58060: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204180.58080: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204180.58108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.58129: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204180.58146: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204180.58167: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204180.58182: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.58203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.58231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.58247: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204180.58266: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204180.58285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.58382: stderr chunk (state=3): >>>debug1: 
auto-mux: Trying existing master <<< 19665 1727204180.58410: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204180.58436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204180.58512: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204180.60281: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204180.60375: stderr chunk (state=3): >>><<< 19665 1727204180.60378: stdout chunk (state=3): >>><<< 19665 1727204180.60477: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204180.60483: _low_level_execute_command(): starting 19665 1727204180.60488: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/AnsiballZ_setup.py && sleep 0' 19665 1727204180.61087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204180.61098: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.61124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204180.61127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204180.61192: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.61195: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204180.61197: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204180.61202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204180.61249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204180.61284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 19665 1727204180.61287: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204180.61344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204181.11176: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "20", "epoch": "1727204180", "epoch_int": "1727204180", "date": "2024-09-24", "time": "14:56:20", "iso8601_micro": "2024-09-24T18:56:20.843829Z", "iso8601": "2024-09-24T18:56:20Z", "iso8601_basic": "20240924T145620843829", "iso8601_basic_short": "20240924T145620", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.32, "5m": 0.33, "15m": 0.17}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", 
"prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", 
"tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixe<<< 19665 1727204181.11214: stdout chunk (state=3): >>>d]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2817, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 715, "free": 2817}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": 
["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 526, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204181.12927: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204181.13010: stderr chunk (state=3): >>><<< 19665 1727204181.13013: stdout chunk (state=3): >>><<< 19665 1727204181.13176: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "20", "epoch": "1727204180", "epoch_int": "1727204180", "date": "2024-09-24", "time": "14:56:20", "iso8601_micro": "2024-09-24T18:56:20.843829Z", "iso8601": "2024-09-24T18:56:20Z", "iso8601_basic": "20240924T145620843829", "iso8601_basic_short": "20240924T145620", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": 
"64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", 
"SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_loadavg": {"1m": 0.32, "5m": 0.33, "15m": 0.17}, "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", 
"tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2817, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 715, "free": 2817}, "nocache": {"free": 3276, "used": 256}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", 
"ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 526, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282120192, "block_size": 4096, "block_total": 65519355, "block_available": 64522002, "block_used": 997353, "inode_total": 131071472, "inode_available": 130998311, "inode_used": 73161, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204181.13435: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204181.13470: _low_level_execute_command(): starting 19665 1727204181.13481: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204180.4738038-22057-113236210484259/ > /dev/null 2>&1 && sleep 0' 19665 1727204181.16195: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.16199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.16236: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204181.16243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.16245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.16296: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204181.16621: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204181.16638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204181.16758: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204181.18688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204181.18692: stdout chunk (state=3): >>><<< 19665 1727204181.18695: stderr chunk (state=3): >>><<< 19665 1727204181.19070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204181.19074: handler run complete 19665 1727204181.19077: variable 'ansible_facts' from source: unknown 19665 1727204181.19079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.19762: variable 'ansible_facts' from source: unknown 19665 1727204181.19853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.20222: attempt loop complete, returning result 19665 1727204181.20233: _execute() done 19665 1727204181.20240: dumping result to json 19665 1727204181.20276: done dumping result, returning 19665 1727204181.20482: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-0000000003a1] 19665 1727204181.20493: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003a1 ok: [managed-node3] 19665 1727204181.21778: no more pending results, returning what we have 19665 1727204181.21782: results queue empty 19665 1727204181.21783: checking for any_errors_fatal 19665 1727204181.21785: done checking for any_errors_fatal 19665 1727204181.21786: checking for max_fail_percentage 19665 1727204181.21787: done checking for max_fail_percentage 19665 1727204181.21788: checking to see if all hosts have failed and the running result is not ok 19665 1727204181.21789: done checking to see if all hosts have failed 19665 1727204181.21790: getting the remaining hosts for this loop 19665 1727204181.21792: done getting the remaining hosts for this loop 19665 1727204181.21796: getting the next task for host managed-node3 19665 1727204181.21803: done getting next task for host managed-node3 19665 1727204181.21805: ^ task is: TASK: meta (flush_handlers) 19665 1727204181.21807: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204181.21811: getting variables 19665 1727204181.21812: in VariableManager get_vars() 19665 1727204181.21845: Calling all_inventory to load vars for managed-node3 19665 1727204181.21848: Calling groups_inventory to load vars for managed-node3 19665 1727204181.21850: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.21862: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.21866: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.21870: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.22686: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003a1 19665 1727204181.22689: WORKER PROCESS EXITING 19665 1727204181.24459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.27147: done with get_vars() 19665 1727204181.27181: done getting variables 19665 1727204181.27256: in VariableManager get_vars() 19665 1727204181.27274: Calling all_inventory to load vars for managed-node3 19665 1727204181.27277: Calling groups_inventory to load vars for managed-node3 19665 1727204181.27279: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.27284: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.27287: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.27290: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.29457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.31322: done with get_vars() 19665 1727204181.31351: done queuing things up, now waiting for results queue to drain 19665 1727204181.31354: results queue empty 19665 1727204181.31355: checking for any_errors_fatal 19665 1727204181.31364: done checking for any_errors_fatal 19665 1727204181.31365: checking for max_fail_percentage 19665 1727204181.31366: done checking for max_fail_percentage 19665 1727204181.31373: checking to see if all hosts have failed and the running result is not ok 19665 1727204181.31374: done checking to see if all hosts have failed 19665 1727204181.31375: getting the remaining hosts for this loop 19665 1727204181.31376: done getting the remaining hosts for this loop 19665 1727204181.31380: getting the next task for host managed-node3 19665 1727204181.31384: done getting next task for host managed-node3 19665 1727204181.31387: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204181.31388: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204181.31398: getting variables 19665 1727204181.31399: in VariableManager get_vars() 19665 1727204181.31416: Calling all_inventory to load vars for managed-node3 19665 1727204181.31418: Calling groups_inventory to load vars for managed-node3 19665 1727204181.31420: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.31426: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.31428: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.31431: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.32677: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.34280: done with get_vars() 19665 1727204181.34422: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.963) 0:00:32.211 ***** 19665 1727204181.34616: entering _queue_task() for managed-node3/include_tasks 19665 1727204181.35304: worker is 1 (out of 1 available) 19665 1727204181.35316: exiting _queue_task() for managed-node3/include_tasks 19665 1727204181.35330: done queuing things up, now waiting for results queue to drain 19665 1727204181.35332: waiting for pending results... 19665 1727204181.36361: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 19665 1727204181.36792: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005a 19665 1727204181.36815: variable 'ansible_search_path' from source: unknown 19665 1727204181.36824: variable 'ansible_search_path' from source: unknown 19665 1727204181.36994: calling self._execute() 19665 1727204181.37183: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.37198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.37219: variable 'omit' from source: magic vars 19665 1727204181.37659: variable 'ansible_distribution_major_version' from source: facts 19665 1727204181.37682: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204181.37694: _execute() done 19665 1727204181.37702: dumping result to json 19665 1727204181.37710: done dumping result, returning 19665 1727204181.37723: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [0affcd87-79f5-0dcc-3ea6-00000000005a] 19665 1727204181.37732: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005a 19665 1727204181.37888: no more pending results, returning what we have 19665 1727204181.37894: in VariableManager get_vars() 19665 1727204181.37938: Calling all_inventory to load vars for managed-node3 19665 1727204181.37940: Calling groups_inventory to load vars for managed-node3 19665 1727204181.37943: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.37957: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.37960: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.37967: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.39171: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005a 19665 1727204181.39175: WORKER PROCESS EXITING 19665 1727204181.40895: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.43994: done with get_vars() 19665 1727204181.44029: variable 'ansible_search_path' from source: unknown 19665 1727204181.44030: variable 'ansible_search_path' from source: unknown 19665 1727204181.44060: we have included files to process 19665 1727204181.44061: generating all_blocks data 19665 1727204181.44063: done generating all_blocks data 19665 1727204181.44065: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204181.44066: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204181.44068: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 19665 1727204181.44648: done processing included file 19665 1727204181.44650: iterating over new_blocks loaded from include file 19665 1727204181.44652: in VariableManager get_vars() 19665 1727204181.44677: done with get_vars() 19665 1727204181.44679: filtering new block on tags 19665 1727204181.44697: done filtering new block on tags 19665 1727204181.44700: in VariableManager get_vars() 19665 1727204181.44722: done with get_vars() 19665 1727204181.44724: filtering new block on tags 19665 1727204181.44743: done filtering new block on tags 19665 1727204181.44746: in VariableManager get_vars() 19665 1727204181.44767: done with get_vars() 19665 1727204181.44768: filtering new block on tags 19665 1727204181.44785: done filtering new block on tags 19665 1727204181.44787: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 19665 1727204181.44793: extending task lists for all hosts with included blocks 19665 1727204181.46166: done extending task lists 19665 1727204181.46168: done processing included files 19665 1727204181.46169: results queue empty 19665 1727204181.46170: checking for any_errors_fatal 19665 1727204181.46172: done checking for any_errors_fatal 19665 1727204181.46173: checking for max_fail_percentage 19665 1727204181.46174: done checking for max_fail_percentage 19665 1727204181.46175: checking to see if all hosts have failed and the running result is not ok 19665 1727204181.46176: done checking to see if all hosts have failed 19665 1727204181.46176: getting the remaining hosts for this loop 19665 1727204181.46178: done getting the remaining hosts for this loop 19665 1727204181.46181: getting the next task for host managed-node3 19665 1727204181.46185: done getting next task for host managed-node3 19665 1727204181.46188: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204181.46191: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204181.46201: getting variables 19665 1727204181.46202: in VariableManager get_vars() 19665 1727204181.46222: Calling all_inventory to load vars for managed-node3 19665 1727204181.46225: Calling groups_inventory to load vars for managed-node3 19665 1727204181.46227: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.46233: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.46236: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.46239: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.48980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.51813: done with get_vars() 19665 1727204181.51849: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.173) 0:00:32.385 ***** 19665 1727204181.51934: entering _queue_task() for managed-node3/setup 19665 1727204181.52940: worker is 1 (out of 1 available) 19665 1727204181.52953: exiting _queue_task() for managed-node3/setup 19665 1727204181.52968: done queuing things up, now waiting for results queue to drain 19665 1727204181.52970: waiting for pending results... 19665 1727204181.54073: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 19665 1727204181.54444: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003e2 19665 1727204181.54468: variable 'ansible_search_path' from source: unknown 19665 1727204181.54477: variable 'ansible_search_path' from source: unknown 19665 1727204181.54521: calling self._execute() 19665 1727204181.54668: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.54683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.54699: variable 'omit' from source: magic vars 19665 1727204181.55114: variable 'ansible_distribution_major_version' from source: facts 19665 1727204181.55134: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204181.55380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204181.58554: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204181.58744: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204181.58833: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204181.58887: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204181.58930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204181.59033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204181.59076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 19665 1727204181.59115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204181.59168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204181.59190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204181.59257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204181.59287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204181.59316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204181.59371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204181.59391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204181.59585: variable '__network_required_facts' from source: role '' defaults 19665 1727204181.59600: variable 'ansible_facts' from source: unknown 19665 1727204181.60420: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 19665 1727204181.60429: when evaluation is False, skipping this task 19665 1727204181.60436: _execute() done 19665 1727204181.60446: dumping result to json 19665 1727204181.60453: done dumping result, returning 19665 1727204181.60468: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [0affcd87-79f5-0dcc-3ea6-0000000003e2] 19665 1727204181.60478: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e2 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204181.60620: no more pending results, returning what we have 19665 1727204181.60625: results queue empty 19665 1727204181.60626: checking for any_errors_fatal 19665 1727204181.60628: done checking for any_errors_fatal 19665 1727204181.60629: checking for max_fail_percentage 19665 1727204181.60631: done checking for max_fail_percentage 19665 1727204181.60632: checking to see if all hosts have failed and the running result is not ok 19665 1727204181.60633: done checking to see if all hosts have failed 19665 1727204181.60634: getting the remaining hosts for this loop 19665 1727204181.60635: done getting the remaining hosts for this loop 19665 1727204181.60643: getting the next task for host managed-node3 19665 1727204181.60654: done getting next task for host 
managed-node3 19665 1727204181.60658: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204181.60662: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204181.60679: getting variables 19665 1727204181.60682: in VariableManager get_vars() 19665 1727204181.60725: Calling all_inventory to load vars for managed-node3 19665 1727204181.60728: Calling groups_inventory to load vars for managed-node3 19665 1727204181.60731: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.60745: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.60749: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.60753: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.62495: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e2 19665 1727204181.62499: WORKER PROCESS EXITING 19665 1727204181.63266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.65029: done with get_vars() 19665 1727204181.65069: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.132) 0:00:32.518 ***** 19665 1727204181.65207: entering _queue_task() for managed-node3/stat 19665 1727204181.65580: worker is 1 (out of 1 available) 19665 1727204181.65592: exiting _queue_task() for managed-node3/stat 19665 1727204181.65604: done queuing things up, now waiting for results queue to drain 19665 1727204181.65606: waiting for pending results... 
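The skip recorded just above comes from the role's guard on required facts: the condition quoted in the trace, __network_required_facts | difference(ansible_facts.keys() | list) | length > 0, evaluated to False because the earlier fact gathering already populated everything the role needs. A sketch of that guard pattern follows; only the task name and the when: expression are taken from this trace, while the task body (the gather_subset value in particular) is an assumption, since the real task in the role's set_facts.yml is not reproduced here:

    # Sketch only; the when: expression is copied from the evaluation above,
    # the gather_subset value is a placeholder assumption.
    - name: Ensure ansible_facts used by role are present
      ansible.builtin.setup:
        gather_subset:
          - min
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0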
19665 1727204181.65884: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 19665 1727204181.66030: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003e4 19665 1727204181.66058: variable 'ansible_search_path' from source: unknown 19665 1727204181.66068: variable 'ansible_search_path' from source: unknown 19665 1727204181.66111: calling self._execute() 19665 1727204181.66217: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.66229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.66246: variable 'omit' from source: magic vars 19665 1727204181.66642: variable 'ansible_distribution_major_version' from source: facts 19665 1727204181.66662: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204181.66844: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204181.67282: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204181.67332: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204181.67378: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204181.67415: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204181.67513: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204181.67543: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204181.67586: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204181.67618: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204181.67721: variable '__network_is_ostree' from source: set_fact 19665 1727204181.67732: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204181.67741: when evaluation is False, skipping this task 19665 1727204181.67747: _execute() done 19665 1727204181.67754: dumping result to json 19665 1727204181.67760: done dumping result, returning 19665 1727204181.67777: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [0affcd87-79f5-0dcc-3ea6-0000000003e4] 19665 1727204181.67786: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e4 19665 1727204181.67900: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e4 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204181.67965: no more pending results, returning what we have 19665 1727204181.67971: results queue empty 19665 1727204181.67972: checking for any_errors_fatal 19665 1727204181.67980: done checking for any_errors_fatal 19665 1727204181.67981: checking for max_fail_percentage 19665 1727204181.67983: done 
checking for max_fail_percentage 19665 1727204181.67984: checking to see if all hosts have failed and the running result is not ok 19665 1727204181.67985: done checking to see if all hosts have failed 19665 1727204181.67986: getting the remaining hosts for this loop 19665 1727204181.67988: done getting the remaining hosts for this loop 19665 1727204181.67992: getting the next task for host managed-node3 19665 1727204181.68000: done getting next task for host managed-node3 19665 1727204181.68005: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204181.68008: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204181.68024: getting variables 19665 1727204181.68026: in VariableManager get_vars() 19665 1727204181.68075: Calling all_inventory to load vars for managed-node3 19665 1727204181.68078: Calling groups_inventory to load vars for managed-node3 19665 1727204181.68080: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.68091: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.68095: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.68098: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.69427: WORKER PROCESS EXITING 19665 1727204181.70429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.72890: done with get_vars() 19665 1727204181.72917: done getting variables 19665 1727204181.72979: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.078) 0:00:32.596 ***** 19665 1727204181.73013: entering _queue_task() for managed-node3/set_fact 19665 1727204181.73355: worker is 1 (out of 1 available) 19665 1727204181.73368: exiting _queue_task() for managed-node3/set_fact 19665 1727204181.73380: done queuing things up, now waiting for results queue to drain 19665 1727204181.73382: waiting for pending results... 
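
The companion task at set_facts.yml:17 converts the stat result into the __network_is_ostree fact so later passes through the role can skip both checks. A minimal sketch, reusing the hypothetical register name from the previous sketch:

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed expression; the skipped task never prints its arguments
      when: not __network_is_ostree is defined

Because __network_is_ostree was already set by an earlier invocation of the role in this run, the when: condition evaluates to False and the task is skipped, as the result below confirms.
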
19665 1727204181.73891: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 19665 1727204181.74053: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003e5 19665 1727204181.74077: variable 'ansible_search_path' from source: unknown 19665 1727204181.74085: variable 'ansible_search_path' from source: unknown 19665 1727204181.74132: calling self._execute() 19665 1727204181.74281: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.74293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.74309: variable 'omit' from source: magic vars 19665 1727204181.75082: variable 'ansible_distribution_major_version' from source: facts 19665 1727204181.75223: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204181.75588: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204181.76137: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204181.76212: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204181.76250: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204181.76287: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204181.76387: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204181.76426: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204181.76465: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204181.76499: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204181.76604: variable '__network_is_ostree' from source: set_fact 19665 1727204181.76622: Evaluated conditional (not __network_is_ostree is defined): False 19665 1727204181.76630: when evaluation is False, skipping this task 19665 1727204181.76636: _execute() done 19665 1727204181.76647: dumping result to json 19665 1727204181.76654: done dumping result, returning 19665 1727204181.76669: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [0affcd87-79f5-0dcc-3ea6-0000000003e5] 19665 1727204181.76680: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e5 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 19665 1727204181.76832: no more pending results, returning what we have 19665 1727204181.76837: results queue empty 19665 1727204181.76838: checking for any_errors_fatal 19665 1727204181.76848: done checking for any_errors_fatal 19665 1727204181.76849: checking for max_fail_percentage 19665 1727204181.76851: done checking for max_fail_percentage 19665 1727204181.76853: checking to see 
if all hosts have failed and the running result is not ok 19665 1727204181.76854: done checking to see if all hosts have failed 19665 1727204181.76854: getting the remaining hosts for this loop 19665 1727204181.76857: done getting the remaining hosts for this loop 19665 1727204181.76862: getting the next task for host managed-node3 19665 1727204181.76875: done getting next task for host managed-node3 19665 1727204181.76880: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204181.76883: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204181.76899: getting variables 19665 1727204181.76901: in VariableManager get_vars() 19665 1727204181.76949: Calling all_inventory to load vars for managed-node3 19665 1727204181.76953: Calling groups_inventory to load vars for managed-node3 19665 1727204181.76955: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204181.76969: Calling all_plugins_play to load vars for managed-node3 19665 1727204181.76973: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204181.76976: Calling groups_plugins_play to load vars for managed-node3 19665 1727204181.77982: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e5 19665 1727204181.77986: WORKER PROCESS EXITING 19665 1727204181.79626: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204181.82913: done with get_vars() 19665 1727204181.82952: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:56:21 -0400 (0:00:00.100) 0:00:32.697 ***** 19665 1727204181.83060: entering _queue_task() for managed-node3/service_facts 19665 1727204181.83417: worker is 1 (out of 1 available) 19665 1727204181.83430: exiting _queue_task() for managed-node3/service_facts 19665 1727204181.83447: done queuing things up, now waiting for results queue to drain 19665 1727204181.83449: waiting for pending results... 
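
Unlike the two skipped tasks, the task at set_facts.yml:21 actually executes on the managed node, which is why the log that follows walks through the full low-level connection sequence: echo ~ to resolve the remote home directory, mkdir of a per-task temporary directory under /root/.ansible/tmp, an sftp transfer of the AnsiballZ_service_facts.py payload, chmod u+x on it, and execution with the remote /usr/bin/python3.9. The module is invoked with no arguments (the result below shows "module_args": {}), so the task itself is just a bare call:

    - name: Check which services are running
      service_facts:

Its return value populates ansible_facts.services, the dictionary visible in the stdout below, keyed by unit name with state, status, and source for each service.
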
19665 1727204181.83758: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 19665 1727204181.83912: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003e7 19665 1727204181.83933: variable 'ansible_search_path' from source: unknown 19665 1727204181.83944: variable 'ansible_search_path' from source: unknown 19665 1727204181.83987: calling self._execute() 19665 1727204181.84118: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.84129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.84145: variable 'omit' from source: magic vars 19665 1727204181.84514: variable 'ansible_distribution_major_version' from source: facts 19665 1727204181.84533: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204181.84551: variable 'omit' from source: magic vars 19665 1727204181.84610: variable 'omit' from source: magic vars 19665 1727204181.84656: variable 'omit' from source: magic vars 19665 1727204181.84696: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204181.84742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204181.84774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204181.84793: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204181.84808: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204181.84842: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204181.84851: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.84861: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.85071: Set connection var ansible_connection to ssh 19665 1727204181.85108: Set connection var ansible_shell_type to sh 19665 1727204181.85157: Set connection var ansible_timeout to 10 19665 1727204181.85179: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204181.85254: Set connection var ansible_shell_executable to /bin/sh 19665 1727204181.85288: Set connection var ansible_pipelining to False 19665 1727204181.85368: variable 'ansible_shell_executable' from source: unknown 19665 1727204181.85403: variable 'ansible_connection' from source: unknown 19665 1727204181.85428: variable 'ansible_module_compression' from source: unknown 19665 1727204181.85436: variable 'ansible_shell_type' from source: unknown 19665 1727204181.85446: variable 'ansible_shell_executable' from source: unknown 19665 1727204181.85453: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204181.85462: variable 'ansible_pipelining' from source: unknown 19665 1727204181.85473: variable 'ansible_timeout' from source: unknown 19665 1727204181.85495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204181.85854: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204181.85984: variable 'omit' from source: magic vars 19665 
1727204181.86000: starting attempt loop 19665 1727204181.86008: running the handler 19665 1727204181.86028: _low_level_execute_command(): starting 19665 1727204181.86042: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204181.87500: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204181.87518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.87538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.87577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.87630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.87658: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204181.87679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.87699: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204181.87712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204181.87723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204181.87736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.87755: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.87774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.87791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.87803: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204181.87816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.87901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204181.87926: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204181.87948: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204181.88037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204181.89646: stdout chunk (state=3): >>>/root <<< 19665 1727204181.89750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204181.89923: stderr chunk (state=3): >>><<< 19665 1727204181.89954: stdout chunk (state=3): >>><<< 19665 1727204181.90031: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204181.90171: _low_level_execute_command(): starting 19665 1727204181.90174: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825 `" && echo ansible-tmp-1727204181.9012449-22117-165364033371825="` echo /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825 `" ) && sleep 0' 19665 1727204181.91019: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204181.91042: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.91059: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.91086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.91133: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.91150: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204181.91167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.91185: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204181.91200: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204181.91216: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204181.91233: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.91243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.91260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.91276: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.91295: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204181.91310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.91400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204181.91426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204181.91442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204181.91545: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204181.93370: stdout chunk (state=3): >>>ansible-tmp-1727204181.9012449-22117-165364033371825=/root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825 <<< 19665 1727204181.93571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204181.93575: stdout chunk (state=3): >>><<< 19665 1727204181.93577: stderr chunk (state=3): >>><<< 19665 1727204181.93673: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204181.9012449-22117-165364033371825=/root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204181.93677: variable 'ansible_module_compression' from source: unknown 19665 1727204181.93904: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 19665 1727204181.93907: variable 'ansible_facts' from source: unknown 19665 1727204181.93910: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/AnsiballZ_service_facts.py 19665 1727204181.93975: Sending initial data 19665 1727204181.93978: Sent initial data (162 bytes) 19665 1727204181.94994: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204181.95023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.95045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.95071: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.95124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.95137: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204181.95156: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.95178: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204181.95190: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204181.95201: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204181.95213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.95232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.95255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.95262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.95271: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204181.95303: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.95362: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204181.95386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204181.95393: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204181.95480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204181.97160: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204181.97203: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204181.97247: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmplwo175dk /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/AnsiballZ_service_facts.py <<< 19665 1727204181.97282: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204181.98503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204181.99116: stderr chunk (state=3): >>><<< 19665 1727204181.99119: stdout chunk (state=3): >>><<< 19665 1727204181.99122: done transferring module to remote 19665 1727204181.99128: _low_level_execute_command(): starting 19665 1727204181.99131: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/ /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/AnsiballZ_service_facts.py && sleep 0' 19665 1727204181.99829: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204181.99847: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204181.99862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204181.99889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204181.99938: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204181.99951: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204181.99965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204181.99983: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204181.99992: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204182.00001: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204182.00016: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204182.00028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 
1727204182.00043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204182.00053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204182.00062: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204182.00075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204182.00158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204182.00182: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204182.00197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204182.00267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204182.02040: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204182.02145: stderr chunk (state=3): >>><<< 19665 1727204182.02157: stdout chunk (state=3): >>><<< 19665 1727204182.02276: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204182.02279: _low_level_execute_command(): starting 19665 1727204182.02282: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/AnsiballZ_service_facts.py && sleep 0' 19665 1727204182.02921: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204182.02943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204182.02959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204182.02983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204182.03027: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204182.03049: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204182.03067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204182.03088: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204182.03101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 
is address <<< 19665 1727204182.03112: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204182.03124: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204182.03140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204182.03166: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204182.03182: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204182.03194: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204182.03207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204182.03294: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204182.03317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204182.03335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204182.03418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.37080: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", <<< 19665 1727204183.37098: stdout chunk (state=3): >>>"source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": 
"dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": 
"nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "stat<<< 19665 1727204183.37109: stdout chunk (state=3): >>>e": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "syst<<< 19665 1727204183.37113: stdout chunk (state=3): >>>emd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.s<<< 19665 1727204183.37141: stdout chunk (state=3): >>>ervice": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": 
"static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 19665 1727204183.38771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204183.38774: stdout chunk (state=3): >>><<< 19665 1727204183.38777: stderr chunk (state=3): >>><<< 19665 1727204183.38783: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles.service": {"name": "systemd-tmpfiles.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "yppasswdd.service": {"name": "yppasswdd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypserv.service": {"name": "ypserv.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ypxfrd.service": {"name": "ypxfrd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "cpupower.service": {"name": "cpupower.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "oddjobd.service": {"name": "oddjobd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rdisc.service": {"name": "rdisc.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate-resume@.service": {"name": "systemd-hibernate-resume@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "teamd@.service": {"name": "teamd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204183.39305: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204183.39321: _low_level_execute_command(): starting 19665 1727204183.39331: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204181.9012449-22117-165364033371825/ > /dev/null 2>&1 && sleep 0' 19665 1727204183.39915: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204183.39927: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.39938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.39955: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.39994: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.40004: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204183.40015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.40031: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass <<< 19665 1727204183.40043: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204183.40051: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204183.40060: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.40074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.40086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.40096: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.40108: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204183.40122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.40714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.40721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204183.40723: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.40767: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.42592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204183.42666: stderr chunk (state=3): >>><<< 19665 1727204183.42678: stdout chunk (state=3): >>><<< 19665 1727204183.42769: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204183.42773: handler run complete 19665 1727204183.42944: variable 'ansible_facts' from source: unknown 19665 1727204183.43327: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204183.43751: variable 'ansible_facts' from source: unknown 19665 1727204183.43880: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204183.44066: attempt loop complete, returning result 19665 1727204183.44078: _execute() done 19665 1727204183.44085: dumping result to json 19665 1727204183.44142: done dumping result, returning 19665 1727204183.44157: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 
[0affcd87-79f5-0dcc-3ea6-0000000003e7] 19665 1727204183.44170: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e7 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204183.49721: no more pending results, returning what we have 19665 1727204183.49724: results queue empty 19665 1727204183.49725: checking for any_errors_fatal 19665 1727204183.49728: done checking for any_errors_fatal 19665 1727204183.49729: checking for max_fail_percentage 19665 1727204183.49730: done checking for max_fail_percentage 19665 1727204183.49731: checking to see if all hosts have failed and the running result is not ok 19665 1727204183.49731: done checking to see if all hosts have failed 19665 1727204183.49732: getting the remaining hosts for this loop 19665 1727204183.49733: done getting the remaining hosts for this loop 19665 1727204183.49737: getting the next task for host managed-node3 19665 1727204183.49742: done getting next task for host managed-node3 19665 1727204183.49745: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204183.49748: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204183.49756: getting variables 19665 1727204183.49757: in VariableManager get_vars() 19665 1727204183.49787: Calling all_inventory to load vars for managed-node3 19665 1727204183.49789: Calling groups_inventory to load vars for managed-node3 19665 1727204183.49791: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204183.49800: Calling all_plugins_play to load vars for managed-node3 19665 1727204183.49802: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204183.49805: Calling groups_plugins_play to load vars for managed-node3 19665 1727204183.53841: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e7 19665 1727204183.53846: WORKER PROCESS EXITING 19665 1727204183.54347: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204183.55255: done with get_vars() 19665 1727204183.55279: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:56:23 -0400 (0:00:01.722) 0:00:34.420 ***** 19665 1727204183.55336: entering _queue_task() for managed-node3/package_facts 19665 1727204183.55576: worker is 1 (out of 1 available) 19665 1727204183.55590: exiting _queue_task() for managed-node3/package_facts 19665 1727204183.55601: done queuing things up, now waiting for results queue to drain 19665 1727204183.55603: waiting for pending results... 
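The service_facts result collected above (hidden in the task summary because no_log is set for this task) is a single ansible_facts.services mapping from unit name to a small record with "name", "state", "status" and "source" keys. The short Python sketch below is not part of the run; it only illustrates that shape, using two entries copied from the output above, and shows how a consumer could pick out the units reported as running.

services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
}

# keep only the units whose reported state is "running"
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service']

In a play the same kind of filtering could be expressed with a when: condition or a Jinja2 selectattr filter against ansible_facts.services rather than in Python.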
19665 1727204183.55935: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 19665 1727204183.56167: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000003e8 19665 1727204183.56182: variable 'ansible_search_path' from source: unknown 19665 1727204183.56196: variable 'ansible_search_path' from source: unknown 19665 1727204183.56252: calling self._execute() 19665 1727204183.56368: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204183.56372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204183.56385: variable 'omit' from source: magic vars 19665 1727204183.56790: variable 'ansible_distribution_major_version' from source: facts 19665 1727204183.56799: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204183.56805: variable 'omit' from source: magic vars 19665 1727204183.56849: variable 'omit' from source: magic vars 19665 1727204183.56910: variable 'omit' from source: magic vars 19665 1727204183.56992: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204183.57036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204183.57075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204183.57107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204183.57133: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204183.57187: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204183.57199: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204183.57228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204183.57330: Set connection var ansible_connection to ssh 19665 1727204183.57361: Set connection var ansible_shell_type to sh 19665 1727204183.57366: Set connection var ansible_timeout to 10 19665 1727204183.57369: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204183.57383: Set connection var ansible_shell_executable to /bin/sh 19665 1727204183.57403: Set connection var ansible_pipelining to False 19665 1727204183.57442: variable 'ansible_shell_executable' from source: unknown 19665 1727204183.57452: variable 'ansible_connection' from source: unknown 19665 1727204183.57465: variable 'ansible_module_compression' from source: unknown 19665 1727204183.57474: variable 'ansible_shell_type' from source: unknown 19665 1727204183.57490: variable 'ansible_shell_executable' from source: unknown 19665 1727204183.57504: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204183.57518: variable 'ansible_pipelining' from source: unknown 19665 1727204183.57527: variable 'ansible_timeout' from source: unknown 19665 1727204183.57537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204183.57771: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204183.57814: variable 'omit' from source: magic vars 19665 
1727204183.57831: starting attempt loop 19665 1727204183.57840: running the handler 19665 1727204183.57866: _low_level_execute_command(): starting 19665 1727204183.57886: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204183.58698: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204183.58723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.58743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.58769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.58813: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.58831: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.58858: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204183.58880: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204183.58894: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204183.58942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.58953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.59025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.59031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.59124: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.60758: stdout chunk (state=3): >>>/root <<< 19665 1727204183.60860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204183.60921: stderr chunk (state=3): >>><<< 19665 1727204183.60929: stdout chunk (state=3): >>><<< 19665 1727204183.60973: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 19665 1727204183.60978: _low_level_execute_command(): starting 19665 1727204183.60981: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218 `" && echo ansible-tmp-1727204183.6095057-22323-275947260018218="` echo /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218 `" ) && sleep 0' 19665 1727204183.61467: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.61471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.61506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204183.61510: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.61512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.61572: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.61575: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204183.61587: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.61623: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.63462: stdout chunk (state=3): >>>ansible-tmp-1727204183.6095057-22323-275947260018218=/root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218 <<< 19665 1727204183.63586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204183.63682: stderr chunk (state=3): >>><<< 19665 1727204183.63703: stdout chunk (state=3): >>><<< 19665 1727204183.63729: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204183.6095057-22323-275947260018218=/root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204183.63779: variable 'ansible_module_compression' from source: unknown 19665 1727204183.63815: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 19665 1727204183.63863: variable 'ansible_facts' from source: unknown 19665 1727204183.64023: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/AnsiballZ_package_facts.py 19665 1727204183.64213: Sending initial data 19665 1727204183.64222: Sent initial data (162 bytes) 19665 1727204183.65026: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.65040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.65062: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204183.65079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.65123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.65135: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.65193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.66991: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204183.67029: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204183.67087: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp9m5x4b_3 /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/AnsiballZ_package_facts.py <<< 19665 1727204183.67101: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204183.70309: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 
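The surrounding chunks trace the usual remote-execution sequence for the package_facts module: a throwaway temp directory is created under ~/.ansible/tmp, the AnsiballZ_package_facts.py payload is pushed with an sftp put over the existing multiplexed SSH connection, marked executable, and run with the remote interpreter, after which the temp directory is removed. The sketch below is only a rough, hand-rolled approximation of that sequence, not Ansible's connection plugin; the user in the host string is inferred, the temp directory name is a placeholder, and scp stands in for the sftp put purely for brevity.

import subprocess

host = "root@10.31.15.87"                               # target from the log (user inferred)
payload = "AnsiballZ_package_facts.py"                  # locally built module payload
remote_tmp = "/root/.ansible/tmp/ansible-tmp-example"   # hypothetical temp dir name

# 1. create the remote temp directory (mirrors the "umask 77 && mkdir -p ..." step)
subprocess.run(["ssh", host, f"umask 77 && mkdir -p {remote_tmp}"], check=True)

# 2. copy the payload; Ansible performs an sftp "put" over the multiplexed
#    connection, plain scp is used here only to keep the sketch short
subprocess.run(["scp", payload, f"{host}:{remote_tmp}/"], check=True)

# 3. mark it executable and run it with the remote interpreter, as in the log
subprocess.run(["ssh", host, f"chmod u+x {remote_tmp}/{payload}"], check=True)
result = subprocess.run(["ssh", host, f"/usr/bin/python3.9 {remote_tmp}/{payload}"],
                        capture_output=True, text=True, check=True)
print(result.stdout)  # a JSON document with {"ansible_facts": {"packages": {...}}}

# 4. remove the temp directory afterwards, like the "rm -f -r ... && sleep 0" step
subprocess.run(["ssh", host, f"rm -f -r {remote_tmp}"], check=True)

The stdout printed in step 3 would be the same kind of JSON document that appears in the chunks that follow, carrying an ansible_facts.packages mapping.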
19665 1727204183.70554: stderr chunk (state=3): >>><<< 19665 1727204183.70558: stdout chunk (state=3): >>><<< 19665 1727204183.70561: done transferring module to remote 19665 1727204183.70573: _low_level_execute_command(): starting 19665 1727204183.70577: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/ /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/AnsiballZ_package_facts.py && sleep 0' 19665 1727204183.71231: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204183.71258: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.71285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.71305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.71350: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.71370: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204183.71386: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.71405: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204183.71418: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204183.71429: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204183.71445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.71461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.71488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.71502: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.71515: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204183.71530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.71623: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.71645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204183.71660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.71733: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204183.73615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204183.73620: stderr chunk (state=3): >>><<< 19665 1727204183.73622: stdout chunk (state=3): >>><<< 19665 1727204183.73671: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204183.73675: _low_level_execute_command(): starting 19665 1727204183.73677: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/AnsiballZ_package_facts.py && sleep 0' 19665 1727204183.74300: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204183.74316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.74332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.74352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.74396: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.74413: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204183.74430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.74449: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204183.74463: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204183.74477: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204183.74489: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204183.74503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204183.74523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204183.74537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204183.74549: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204183.74562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204183.74646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204183.74666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204183.74681: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204183.74768: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204184.21223: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", 
"release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": 
[{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"n<<< 19665 1727204184.21389: stdout chunk (state=3): >>>ame": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": 
"20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", 
"release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": 
"7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": 
"7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", 
"version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": 
"libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.<<< 19665 1727204184.21407: stdout chunk (state=3): >>>191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": 
"python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "sou<<< 19665 1727204184.21410: stdout chunk (state=3): >>>rce": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": 
"python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9"<<< 19665 1727204184.21427: stdout chunk (state=3): >>>, "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": 
[{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 19665 1727204184.22914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204184.23002: stderr chunk (state=3): >>><<< 19665 1727204184.23005: stdout chunk (state=3): >>><<< 19665 1727204184.23073: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240905", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools-wheel": [{"name": "python3-setuptools-wheel", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20210518", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.15.2", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.348", "release": "9.15.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "7.el9.1", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dejavu-sans-fonts": [{"name": "dejavu-sans-fonts", "version": "2.37", "release": "18.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-core-font-en": [{"name": "langpacks-core-font-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "9.0", "release": "26.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.13.7", "release": "10.el9", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.16", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.1.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.18", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.34.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.4", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.9.13", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.24", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libunistring": [{"name": "libunistring", "version": "0.9.10", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.4", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.2", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.42", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.0", "release": "13.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.3", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.40", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.8.0", "release": "7.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.9", "release": "9.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "38.20210216cvs.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.4", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": 
"json-c", "version": "0.14", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.16.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.3", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.39", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.4.0", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.2", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.13", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre": [{"name": "pcre", "version": "8.44", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.6", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "8.32", "release": "36.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "91.4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "28", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.6", "release": "27.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-tools": [{"name": "dbus-tools", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "3.3.17", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.4", "release": "10.git1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3-cli": [{"name": "libnl3-cli", "version": "3.9.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libteam": [{"name": "libteam", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "acl": [{"name": "acl", "version": "2.3.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext": [{"name": "gettext", "version": "0.21", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "attr": [{"name": "attr", "version": "2.5.1", "release": "3.el9", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.1.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.1.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.1", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.5.1", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.2", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.11", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.22.4", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.8", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-compat": [{"name": "libxcrypt-compat", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.9.19", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.18", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.15.0", "release": "9.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.1", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "234", "release": "19.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.2", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "590", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-rpm-macros": [{"name": "systemd-rpm-macros", "version": "252", "release": "47.el9", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.19.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.13", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.7", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "49", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.0.9", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdaemon": [{"name": "libdaemon", "version": "0.14", "release": "23.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "teamd": [{"name": "teamd", "version": "1.31", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.4.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.4", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.1", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.37.4", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.12.20", "release": "8.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "28", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.5.7", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11", "release": "26.20190603git.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el9", "epoch": 9, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": 
"cryptsetup-libs", "version": "2.7.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.8.7", "release": "32.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20210202", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.4.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.18.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob": [{"name": "oddjob", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oddjob-mkhomedir": [{"name": "oddjob-mkhomedir", "version": "0.34.7", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.4.0", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "3.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.43.0", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "53.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "3.el9", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.1", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.27", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.6", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.4", "release": "13.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.35.2", "release": "54.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "3.2.3", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "252", "release": "47.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.77", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "7.76.1", "release": "31.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "63.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.24", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": 
"selinux-policy-targeted", "version": "38.1.45", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.29", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.68.4", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.2.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.13.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.68.0", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.40.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.63", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.3.3", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.14.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.69.0", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.69.0", "release": 
"12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gpg": [{"name": "python3-gpg", "version": "1.15.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.6", "release": "1.el9.6", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.7.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.2", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.21", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.12.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "0.99.9", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog-logrotate": [{"name": "rsyslog-logrotate", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2310.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "10.el9", "epoch": 17, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.47", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": 
[{"name": "kexec-tools", "version": "2.0.27", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.14.0", "release": "17.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "1.3.4", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "NetworkManager-team": [{"name": "NetworkManager-team", "version": "1.51.0", "release": "1.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.11.8", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "1.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240828", "release": "2.git626aa59.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.16.1.3", "release": "34.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "92.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "057", "release": "70.git20240819.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.5p2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "8.7p1", "release": "43.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.9.3", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.2.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-compat": [{"name": "authselect-compat", "version": "1.2.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.5", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el9", "epoch": 4, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": 
"1.46.5", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.1.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.19.2", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.2", "release": "10.20210508.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwl100-firmware": [{"name": "iwl100-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl1000-firmware": [{"name": "iwl1000-firmware", "version": "39.31.5.1", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl105-firmware": [{"name": "iwl105-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl135-firmware": [{"name": "iwl135-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2000-firmware": [{"name": "iwl2000-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl2030-firmware": [{"name": "iwl2030-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl3160-firmware": [{"name": "iwl3160-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "iwl5000-firmware": [{"name": "iwl5000-firmware", "version": "8.83.5.1_1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl5150-firmware": [{"name": "iwl5150-firmware", "version": "8.24.2.2", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6000g2a-firmware": [{"name": "iwl6000g2a-firmware", "version": "18.168.6.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl6050-firmware": [{"name": "iwl6050-firmware", "version": "41.28.5.1", "release": "146.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwl7260-firmware": [{"name": "iwl7260-firmware", "version": "25.30.13.0", "release": "146.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "31.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "3228467c", "release": "613798eb", "epoch": null, "arch": null, "source": "rpm"}, {"name": "gpg-pubkey", "version": "8483c65d", "release": "5ccc5b19", "epoch": null, "arch": null, "source": "rpm"}], "epel-release": [{"name": "epel-release", "version": "9", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": 
"4.3", "release": "8.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.2.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "57.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "21.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "35.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.7.4", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.101.0", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.75.0", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.19", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.80", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.174", "release": "462.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.13", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.09", "release": "3.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20200520", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.60.800", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.41", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.300", "release": "7.el9", "epoch": 2, "arch": 
"noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2013.0523", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.073", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.66", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "1.94", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.21", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-subs": [{"name": "perl-subs", "version": "1.03", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.17", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.42", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.076", "release": "462.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.031", "release": "4.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.08", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.09", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "4.14", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.13", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.30", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.23", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.43", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.01", "release": "4.el9", "epoch": 
4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.30", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.85", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.12", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.56", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.21", "release": "460.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.31", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.238", "release": "460.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.52", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.50", "release": "460.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.74", "release": "461.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.15", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.78", "release": "461.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.08", "release": "462.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.32.1", "release": "481.el9", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "8.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.1.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers": [{"name": "glibc-headers", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "5.18", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libpkgconf": [{"name": 
"libpkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.7.3", "release": "10.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.1", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.11", "release": "41.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.34", "release": "125.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.18", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "11.5.0", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.2.2", "release": "6.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "5.14.0", "release": "511.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.2.5", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "38", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "4.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.1", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.3", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.6", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.1", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.37", "release": "481.el9", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Error": [{"name": 
"perl-Error", "version": "0.17029", "release": "7.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.47", "release": "481.el9", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "11.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.34", "release": "9.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.6.5", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "29.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "emacs-filesystem": [{"name": "emacs-filesystem", "version": "27.2", "release": "10.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.43.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.3.0", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "18.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.34", "release": "7.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "4.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "5.4.1", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "6.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "3.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.8.4", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.5.4", "release": "27.el9", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "14.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "8.2.2637", "release": "21.el9", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "4.el9", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.2.3", "release": "20.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "53.0.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.5.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "2.10", "release": "7.el9.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pytz": [{"name": "python3-pytz", "version": "2021.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-babel": [{"name": "python3-babel", "version": "2.9.1", "release": "2.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.4", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyrsistent": [{"name": "python3-pyrsistent", "version": "0.17.3", "release": "8.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-prettytable": [{"name": "python3-prettytable", "version": "0.7.2", "release": "27.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.1.1", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.10.6", "release": "15.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "1.1.1", "release": "12.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "2.11.3", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.6", "release": "2.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.0", "release": "4.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.21", "release": "16.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.6", "release": "25.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "20.3.0", "release": "7.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "3.2.0", "release": "13.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.5.2", "release": "4.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "geolite2-country": [{"name": "geolite2-country", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "geolite2-city": [{"name": "geolite2-city", "version": "20191217", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.0", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"gdisk": [{"name": "gdisk", "version": "1.0.7", "release": "5.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.6", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.6", "release": "2.1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "12.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.5", "release": "6.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-chardet": [{"name": "python3-chardet", "version": "4.0.0", "release": "5.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.25.1", "release": "8.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.2", "release": "19.b1.el9", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.4", "release": "19.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "7.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "21.3.1", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.85", "release": "16.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204184.26835: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204184.26868: _low_level_execute_command(): starting 19665 1727204184.26877: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204183.6095057-22323-275947260018218/ > /dev/null 2>&1 && sleep 0' 19665 1727204184.27494: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204184.27509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204184.27582: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204184.27599: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204184.27645: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204184.27658: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204184.27680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204184.27700: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204184.27712: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204184.27723: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204184.27732: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204184.27779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204184.27793: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204184.27802: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204184.27811: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204184.27827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204184.27900: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204184.27930: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204184.27949: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204184.28044: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204184.29919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204184.29923: stdout chunk (state=3): >>><<< 19665 1727204184.29925: stderr chunk (state=3): >>><<< 19665 1727204184.30568: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204184.30572: handler run complete 19665 1727204184.30860: variable 'ansible_facts' from source: unknown 19665 1727204184.31367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.33489: variable 'ansible_facts' from source: unknown 19665 1727204184.33968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.34758: attempt loop complete, returning result 19665 1727204184.34782: _execute() done 19665 1727204184.34791: dumping result to json 19665 1727204184.35032: done dumping result, returning 19665 1727204184.35047: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [0affcd87-79f5-0dcc-3ea6-0000000003e8] 19665 1727204184.35058: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e8 19665 1727204184.37411: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000003e8 19665 1727204184.37415: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204184.37557: no more pending results, returning what we have 19665 1727204184.37561: results queue empty 19665 1727204184.37562: checking for any_errors_fatal 19665 1727204184.37571: done checking for any_errors_fatal 19665 1727204184.37572: checking for max_fail_percentage 19665 1727204184.37573: done checking for max_fail_percentage 19665 1727204184.37574: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.37575: done checking to see if all hosts have failed 19665 1727204184.37575: getting the remaining hosts for this loop 19665 1727204184.37577: done getting the remaining hosts for this loop 19665 1727204184.37580: getting the next task for host managed-node3 19665 1727204184.37586: done getting next task for host managed-node3 19665 1727204184.37590: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204184.37592: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204184.37601: getting variables 19665 1727204184.37602: in VariableManager get_vars() 19665 1727204184.37634: Calling all_inventory to load vars for managed-node3 19665 1727204184.37637: Calling groups_inventory to load vars for managed-node3 19665 1727204184.37641: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.37651: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.37653: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.37656: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.39023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.40849: done with get_vars() 19665 1727204184.40877: done getting variables 19665 1727204184.40946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.856) 0:00:35.276 ***** 19665 1727204184.40985: entering _queue_task() for managed-node3/debug 19665 1727204184.41341: worker is 1 (out of 1 available) 19665 1727204184.41353: exiting _queue_task() for managed-node3/debug 19665 1727204184.41367: done queuing things up, now waiting for results queue to drain 19665 1727204184.41368: waiting for pending results... 19665 1727204184.41661: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 19665 1727204184.41791: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005b 19665 1727204184.41815: variable 'ansible_search_path' from source: unknown 19665 1727204184.41822: variable 'ansible_search_path' from source: unknown 19665 1727204184.41867: calling self._execute() 19665 1727204184.41977: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.41989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.42012: variable 'omit' from source: magic vars 19665 1727204184.42458: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.42478: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.42488: variable 'omit' from source: magic vars 19665 1727204184.42532: variable 'omit' from source: magic vars 19665 1727204184.42650: variable 'network_provider' from source: set_fact 19665 1727204184.42680: variable 'omit' from source: magic vars 19665 1727204184.42729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204184.42784: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204184.42811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204184.42833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204184.42853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 
1727204184.42895: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204184.42904: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.42911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.43023: Set connection var ansible_connection to ssh 19665 1727204184.43035: Set connection var ansible_shell_type to sh 19665 1727204184.43048: Set connection var ansible_timeout to 10 19665 1727204184.43057: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204184.43071: Set connection var ansible_shell_executable to /bin/sh 19665 1727204184.43082: Set connection var ansible_pipelining to False 19665 1727204184.43116: variable 'ansible_shell_executable' from source: unknown 19665 1727204184.43124: variable 'ansible_connection' from source: unknown 19665 1727204184.43131: variable 'ansible_module_compression' from source: unknown 19665 1727204184.43137: variable 'ansible_shell_type' from source: unknown 19665 1727204184.43146: variable 'ansible_shell_executable' from source: unknown 19665 1727204184.43154: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.43161: variable 'ansible_pipelining' from source: unknown 19665 1727204184.43170: variable 'ansible_timeout' from source: unknown 19665 1727204184.43177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.43332: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204184.43351: variable 'omit' from source: magic vars 19665 1727204184.43360: starting attempt loop 19665 1727204184.43369: running the handler 19665 1727204184.43417: handler run complete 19665 1727204184.43444: attempt loop complete, returning result 19665 1727204184.43451: _execute() done 19665 1727204184.43458: dumping result to json 19665 1727204184.43466: done dumping result, returning 19665 1727204184.43476: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [0affcd87-79f5-0dcc-3ea6-00000000005b] 19665 1727204184.43485: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005b ok: [managed-node3] => {} MSG: Using network provider: nm 19665 1727204184.43654: no more pending results, returning what we have 19665 1727204184.43658: results queue empty 19665 1727204184.43659: checking for any_errors_fatal 19665 1727204184.43672: done checking for any_errors_fatal 19665 1727204184.43673: checking for max_fail_percentage 19665 1727204184.43676: done checking for max_fail_percentage 19665 1727204184.43677: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.43678: done checking to see if all hosts have failed 19665 1727204184.43679: getting the remaining hosts for this loop 19665 1727204184.43681: done getting the remaining hosts for this loop 19665 1727204184.43685: getting the next task for host managed-node3 19665 1727204184.43693: done getting next task for host managed-node3 19665 1727204184.43697: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204184.43699: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204184.43709: getting variables 19665 1727204184.43711: in VariableManager get_vars() 19665 1727204184.43749: Calling all_inventory to load vars for managed-node3 19665 1727204184.43752: Calling groups_inventory to load vars for managed-node3 19665 1727204184.43754: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.43767: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.43770: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.43773: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.44814: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005b 19665 1727204184.44818: WORKER PROCESS EXITING 19665 1727204184.46438: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.48231: done with get_vars() 19665 1727204184.48260: done getting variables 19665 1727204184.48380: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.074) 0:00:35.350 ***** 19665 1727204184.48411: entering _queue_task() for managed-node3/fail 19665 1727204184.48867: worker is 1 (out of 1 available) 19665 1727204184.48880: exiting _queue_task() for managed-node3/fail 19665 1727204184.48892: done queuing things up, now waiting for results queue to drain 19665 1727204184.48893: waiting for pending results... 
19665 1727204184.49259: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 19665 1727204184.49394: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005c 19665 1727204184.49422: variable 'ansible_search_path' from source: unknown 19665 1727204184.49430: variable 'ansible_search_path' from source: unknown 19665 1727204184.49480: calling self._execute() 19665 1727204184.49601: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.49614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.49639: variable 'omit' from source: magic vars 19665 1727204184.50071: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.50100: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.50234: variable 'network_state' from source: role '' defaults 19665 1727204184.50253: Evaluated conditional (network_state != {}): False 19665 1727204184.50262: when evaluation is False, skipping this task 19665 1727204184.50271: _execute() done 19665 1727204184.50279: dumping result to json 19665 1727204184.50292: done dumping result, returning 19665 1727204184.50310: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [0affcd87-79f5-0dcc-3ea6-00000000005c] 19665 1727204184.50323: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005c skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204184.50557: no more pending results, returning what we have 19665 1727204184.50562: results queue empty 19665 1727204184.50566: checking for any_errors_fatal 19665 1727204184.50577: done checking for any_errors_fatal 19665 1727204184.50578: checking for max_fail_percentage 19665 1727204184.50580: done checking for max_fail_percentage 19665 1727204184.50581: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.50582: done checking to see if all hosts have failed 19665 1727204184.50583: getting the remaining hosts for this loop 19665 1727204184.50586: done getting the remaining hosts for this loop 19665 1727204184.50590: getting the next task for host managed-node3 19665 1727204184.50598: done getting next task for host managed-node3 19665 1727204184.50604: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204184.50607: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204184.50623: getting variables 19665 1727204184.50625: in VariableManager get_vars() 19665 1727204184.50668: Calling all_inventory to load vars for managed-node3 19665 1727204184.50672: Calling groups_inventory to load vars for managed-node3 19665 1727204184.50675: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.50688: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.50691: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.50694: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.51720: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005c 19665 1727204184.51724: WORKER PROCESS EXITING 19665 1727204184.52550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.54545: done with get_vars() 19665 1727204184.54577: done getting variables 19665 1727204184.54653: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.062) 0:00:35.413 ***** 19665 1727204184.54704: entering _queue_task() for managed-node3/fail 19665 1727204184.55106: worker is 1 (out of 1 available) 19665 1727204184.55121: exiting _queue_task() for managed-node3/fail 19665 1727204184.55134: done queuing things up, now waiting for results queue to drain 19665 1727204184.55136: waiting for pending results... 
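
The skip above is the guard that aborts the run when the declarative network_state variable is combined with the initscripts provider; in this run network_state is empty, so network_state != {} evaluates to False and the fail task never fires. A hedged sketch of such a guard (only the network_state != {} condition is visible in the log; the provider comparison is an assumption inferred from the task name):

- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying the network state configuration is not supported with the initscripts provider
  when:
    - network_state != {}
    - network_provider == "initscripts"   # assumption; not evaluated in this log because the first condition is already False
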
19665 1727204184.55326: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 19665 1727204184.55406: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005d 19665 1727204184.55416: variable 'ansible_search_path' from source: unknown 19665 1727204184.55420: variable 'ansible_search_path' from source: unknown 19665 1727204184.55453: calling self._execute() 19665 1727204184.55530: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.55533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.55545: variable 'omit' from source: magic vars 19665 1727204184.55858: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.55871: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.55959: variable 'network_state' from source: role '' defaults 19665 1727204184.55970: Evaluated conditional (network_state != {}): False 19665 1727204184.55973: when evaluation is False, skipping this task 19665 1727204184.55976: _execute() done 19665 1727204184.55978: dumping result to json 19665 1727204184.55981: done dumping result, returning 19665 1727204184.55988: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [0affcd87-79f5-0dcc-3ea6-00000000005d] 19665 1727204184.55994: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005d 19665 1727204184.56080: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005d 19665 1727204184.56083: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204184.56128: no more pending results, returning what we have 19665 1727204184.56132: results queue empty 19665 1727204184.56133: checking for any_errors_fatal 19665 1727204184.56143: done checking for any_errors_fatal 19665 1727204184.56144: checking for max_fail_percentage 19665 1727204184.56146: done checking for max_fail_percentage 19665 1727204184.56147: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.56147: done checking to see if all hosts have failed 19665 1727204184.56148: getting the remaining hosts for this loop 19665 1727204184.56150: done getting the remaining hosts for this loop 19665 1727204184.56154: getting the next task for host managed-node3 19665 1727204184.56159: done getting next task for host managed-node3 19665 1727204184.56165: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204184.56168: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204184.56183: getting variables 19665 1727204184.56185: in VariableManager get_vars() 19665 1727204184.56224: Calling all_inventory to load vars for managed-node3 19665 1727204184.56227: Calling groups_inventory to load vars for managed-node3 19665 1727204184.56229: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.56238: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.56241: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.56243: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.57544: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.59146: done with get_vars() 19665 1727204184.59173: done getting variables 19665 1727204184.59229: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.045) 0:00:35.459 ***** 19665 1727204184.59262: entering _queue_task() for managed-node3/fail 19665 1727204184.59575: worker is 1 (out of 1 available) 19665 1727204184.59594: exiting _queue_task() for managed-node3/fail 19665 1727204184.59608: done queuing things up, now waiting for results queue to drain 19665 1727204184.59610: waiting for pending results... 
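
The next guard, skipped for the same reason, refuses to apply network_state on managed hosts older than EL8; the evaluation short-circuits on network_state != {}, so the version comparison never appears in the log. A plausible sketch under that assumption:

- name: Abort applying the network state configuration if the system version of the managed host is below 8
  ansible.builtin.fail:
    msg: Applying the network state configuration requires a managed host running version 8 or newer
  when:
    - network_state != {}
    - ansible_distribution_major_version | int < 8   # assumption; only network_state != {} is evaluated above
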
19665 1727204184.59907: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 19665 1727204184.60034: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005e 19665 1727204184.60060: variable 'ansible_search_path' from source: unknown 19665 1727204184.60071: variable 'ansible_search_path' from source: unknown 19665 1727204184.60113: calling self._execute() 19665 1727204184.60222: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.60234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.60249: variable 'omit' from source: magic vars 19665 1727204184.60641: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.60661: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.60852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204184.63342: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204184.63425: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204184.63470: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204184.63507: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204184.63540: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204184.63623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.63674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.63704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.63752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.63774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.63882: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.63902: Evaluated conditional (ansible_distribution_major_version | int > 9): False 19665 1727204184.63910: when evaluation is False, skipping this task 19665 1727204184.63917: _execute() done 19665 1727204184.63923: dumping result to json 19665 1727204184.63930: done dumping result, returning 19665 1727204184.63941: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [0affcd87-79f5-0dcc-3ea6-00000000005e] 19665 1727204184.63950: sending task result for task 
0affcd87-79f5-0dcc-3ea6-00000000005e skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int > 9", "skip_reason": "Conditional result was False" } 19665 1727204184.64110: no more pending results, returning what we have 19665 1727204184.64115: results queue empty 19665 1727204184.64116: checking for any_errors_fatal 19665 1727204184.64123: done checking for any_errors_fatal 19665 1727204184.64124: checking for max_fail_percentage 19665 1727204184.64126: done checking for max_fail_percentage 19665 1727204184.64126: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.64127: done checking to see if all hosts have failed 19665 1727204184.64128: getting the remaining hosts for this loop 19665 1727204184.64130: done getting the remaining hosts for this loop 19665 1727204184.64134: getting the next task for host managed-node3 19665 1727204184.64141: done getting next task for host managed-node3 19665 1727204184.64145: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204184.64147: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204184.64160: getting variables 19665 1727204184.64162: in VariableManager get_vars() 19665 1727204184.64204: Calling all_inventory to load vars for managed-node3 19665 1727204184.64206: Calling groups_inventory to load vars for managed-node3 19665 1727204184.64209: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.64220: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.64223: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.64226: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.65307: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005e 19665 1727204184.65311: WORKER PROCESS EXITING 19665 1727204184.65982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.67725: done with get_vars() 19665 1727204184.67749: done getting variables 19665 1727204184.67808: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.085) 0:00:35.545 ***** 19665 1727204184.67839: entering _queue_task() for managed-node3/dnf 19665 1727204184.68171: worker is 1 (out of 1 available) 19665 1727204184.68184: exiting _queue_task() for managed-node3/dnf 19665 1727204184.68196: done queuing things up, now waiting for results queue to drain 19665 1727204184.68198: waiting for pending results... 
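
The teaming guard above is skipped because this host reports a distribution major version of 9 or lower; the log confirms the ansible_distribution_major_version | int > 9 check. A sketch of such a task; the second condition is an assumption based on the __network_team_connections_defined role variable referenced later in the log:

- name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
  ansible.builtin.fail:
    msg: Team interfaces are not supported on EL10 or later
  when:
    - ansible_distribution_major_version | int > 9
    - __network_team_connections_defined   # assumption; a flag derived from the team profiles in network_connections
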
19665 1727204184.68473: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 19665 1727204184.68609: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000005f 19665 1727204184.68625: variable 'ansible_search_path' from source: unknown 19665 1727204184.68631: variable 'ansible_search_path' from source: unknown 19665 1727204184.68677: calling self._execute() 19665 1727204184.68782: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.68794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.68809: variable 'omit' from source: magic vars 19665 1727204184.69194: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.69215: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.69421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204184.71850: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204184.71928: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204184.71973: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204184.72017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204184.72049: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204184.72138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.72189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.72225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.72275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.72296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.72421: variable 'ansible_distribution' from source: facts 19665 1727204184.72435: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.72454: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 19665 1727204184.72583: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204184.72718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.72747: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.72784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.72830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.72850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.72902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.72930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.72960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.73010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.73029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.73075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.73104: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.73132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.73176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.73198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.73355: variable 'network_connections' from source: play vars 19665 1727204184.73374: variable 'profile' from source: play vars 19665 1727204184.73450: variable 'profile' from source: play vars 19665 1727204184.73459: variable 'interface' from source: set_fact 19665 1727204184.73526: variable 'interface' from source: set_fact 19665 1727204184.73604: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 19665 1727204184.73792: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204184.73835: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204184.73876: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204184.73910: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204184.73962: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204184.73993: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204184.74035: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.74074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204184.74126: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204184.74396: variable 'network_connections' from source: play vars 19665 1727204184.74406: variable 'profile' from source: play vars 19665 1727204184.74474: variable 'profile' from source: play vars 19665 1727204184.74485: variable 'interface' from source: set_fact 19665 1727204184.74550: variable 'interface' from source: set_fact 19665 1727204184.74582: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204184.74591: when evaluation is False, skipping this task 19665 1727204184.74602: _execute() done 19665 1727204184.74608: dumping result to json 19665 1727204184.74615: done dumping result, returning 19665 1727204184.74627: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-00000000005f] 19665 1727204184.74638: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005f skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204184.74794: no more pending results, returning what we have 19665 1727204184.74798: results queue empty 19665 1727204184.74800: checking for any_errors_fatal 19665 1727204184.74806: done checking for any_errors_fatal 19665 1727204184.74807: checking for max_fail_percentage 19665 1727204184.74809: done checking for max_fail_percentage 19665 1727204184.74810: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.74811: done checking to see if all hosts have failed 19665 1727204184.74812: getting the remaining hosts for this loop 19665 1727204184.74814: done getting the remaining hosts for this loop 19665 1727204184.74819: getting the next task for host managed-node3 19665 1727204184.74826: done getting next task for host managed-node3 19665 
1727204184.74831: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204184.74833: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204184.74848: getting variables 19665 1727204184.74850: in VariableManager get_vars() 19665 1727204184.74892: Calling all_inventory to load vars for managed-node3 19665 1727204184.74895: Calling groups_inventory to load vars for managed-node3 19665 1727204184.74897: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.74908: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.74911: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.74914: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.75885: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000005f 19665 1727204184.75888: WORKER PROCESS EXITING 19665 1727204184.76630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.78252: done with get_vars() 19665 1727204184.78288: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 19665 1727204184.78345: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.105) 0:00:35.650 ***** 19665 1727204184.78376: entering _queue_task() for managed-node3/yum 19665 1727204184.78624: worker is 1 (out of 1 available) 19665 1727204184.78638: exiting _queue_task() for managed-node3/yum 19665 1727204184.78651: done queuing things up, now waiting for results queue to drain 19665 1727204184.78653: waiting for pending results... 
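Note: task 0affcd87-79f5-0dcc-3ea6-00000000005f above is skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined is true for the profile under test, a guard that recurs on several tasks in this role. As a hedged illustration only (the real task lives in the role's tasks/main.yml and its arguments are not visible in this log), a check guarded this way could look roughly like:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # hypothetical argument, not taken from this log
        state: latest
      check_mode: true
      when: __network_wireless_connections_defined or __network_team_connections_defined

The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry above also shows that, on this ansible-core 2.17 controller, tasks written against ansible.builtin.yum are executed by the dnf action plugin.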
19665 1727204184.78848: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 19665 1727204184.78940: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000060 19665 1727204184.78956: variable 'ansible_search_path' from source: unknown 19665 1727204184.78960: variable 'ansible_search_path' from source: unknown 19665 1727204184.78997: calling self._execute() 19665 1727204184.79082: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.79086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.79096: variable 'omit' from source: magic vars 19665 1727204184.79390: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.79402: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.79527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204184.81637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204184.81687: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204184.81716: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204184.81750: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204184.81771: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204184.81841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.81877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.81897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.81924: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.81944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.82019: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.82033: Evaluated conditional (ansible_distribution_major_version | int < 8): False 19665 1727204184.82040: when evaluation is False, skipping this task 19665 1727204184.82047: _execute() done 19665 1727204184.82050: dumping result to json 19665 1727204184.82053: done dumping result, returning 19665 1727204184.82061: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000060] 19665 
1727204184.82069: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000060 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 19665 1727204184.82209: no more pending results, returning what we have 19665 1727204184.82213: results queue empty 19665 1727204184.82214: checking for any_errors_fatal 19665 1727204184.82224: done checking for any_errors_fatal 19665 1727204184.82225: checking for max_fail_percentage 19665 1727204184.82226: done checking for max_fail_percentage 19665 1727204184.82227: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.82228: done checking to see if all hosts have failed 19665 1727204184.82229: getting the remaining hosts for this loop 19665 1727204184.82230: done getting the remaining hosts for this loop 19665 1727204184.82235: getting the next task for host managed-node3 19665 1727204184.82241: done getting next task for host managed-node3 19665 1727204184.82245: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204184.82246: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204184.82259: getting variables 19665 1727204184.82261: in VariableManager get_vars() 19665 1727204184.82304: Calling all_inventory to load vars for managed-node3 19665 1727204184.82307: Calling groups_inventory to load vars for managed-node3 19665 1727204184.82309: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.82320: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.82323: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.82325: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.82961: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000060 19665 1727204184.82967: WORKER PROCESS EXITING 19665 1727204184.83389: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204184.84347: done with get_vars() 19665 1727204184.84370: done getting variables 19665 1727204184.84454: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:56:24 -0400 (0:00:00.061) 0:00:35.711 ***** 19665 1727204184.84529: entering _queue_task() for managed-node3/fail 19665 1727204184.84787: worker is 1 (out of 1 available) 19665 1727204184.84801: exiting _queue_task() for managed-node3/fail 19665 1727204184.84814: done queuing things up, now waiting for results queue to drain 19665 1727204184.84815: waiting for pending results... 
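Note: the YUM variant of the same check (task 0affcd87-79f5-0dcc-3ea6-000000000060, main.yml:48) is skipped on a different guard, ansible_distribution_major_version | int < 8, which is false on this node. It mirrors the DNF task above, whose distribution guard (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) evaluated True, so at most one of the pair can run on a given host. A hedged sketch of the legacy path, with hypothetical arguments:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:   # redirected to the dnf action plugin on this controller, per the log entry above
        name: "{{ network_packages }}"   # hypothetical argument, not taken from this log
        state: latest
      check_mode: true
      when: ansible_distribution_major_version | int < 8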
19665 1727204184.85357: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 19665 1727204184.85970: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000061 19665 1727204184.85992: variable 'ansible_search_path' from source: unknown 19665 1727204184.86004: variable 'ansible_search_path' from source: unknown 19665 1727204184.86067: calling self._execute() 19665 1727204184.86310: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204184.86329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204184.86349: variable 'omit' from source: magic vars 19665 1727204184.87519: variable 'ansible_distribution_major_version' from source: facts 19665 1727204184.87538: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204184.87686: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204184.87928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204184.92107: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204184.92197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204184.92251: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204184.92298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204184.92329: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204184.92429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.92487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.92523: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.92585: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.92605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.92652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.92692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.92720: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.92773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.92802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.92847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204184.92877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204184.92919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.92961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204184.92981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204184.93194: variable 'network_connections' from source: play vars 19665 1727204184.93211: variable 'profile' from source: play vars 19665 1727204184.93398: variable 'profile' from source: play vars 19665 1727204184.93408: variable 'interface' from source: set_fact 19665 1727204184.93484: variable 'interface' from source: set_fact 19665 1727204184.93567: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204184.93758: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204184.93823: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204184.93857: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204184.93977: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204184.94123: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204184.94152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204184.94253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204184.94290: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204184.94391: 
variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204184.95873: variable 'network_connections' from source: play vars 19665 1727204184.95956: variable 'profile' from source: play vars 19665 1727204184.96033: variable 'profile' from source: play vars 19665 1727204184.96171: variable 'interface' from source: set_fact 19665 1727204184.96266: variable 'interface' from source: set_fact 19665 1727204184.96305: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204184.96390: when evaluation is False, skipping this task 19665 1727204184.96398: _execute() done 19665 1727204184.96405: dumping result to json 19665 1727204184.96413: done dumping result, returning 19665 1727204184.96428: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000061] 19665 1727204184.96446: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000061 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204184.96610: no more pending results, returning what we have 19665 1727204184.96615: results queue empty 19665 1727204184.96616: checking for any_errors_fatal 19665 1727204184.96623: done checking for any_errors_fatal 19665 1727204184.96623: checking for max_fail_percentage 19665 1727204184.96626: done checking for max_fail_percentage 19665 1727204184.96626: checking to see if all hosts have failed and the running result is not ok 19665 1727204184.96627: done checking to see if all hosts have failed 19665 1727204184.96628: getting the remaining hosts for this loop 19665 1727204184.96630: done getting the remaining hosts for this loop 19665 1727204184.96634: getting the next task for host managed-node3 19665 1727204184.96641: done getting next task for host managed-node3 19665 1727204184.96645: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 19665 1727204184.96647: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204184.96664: getting variables 19665 1727204184.96666: in VariableManager get_vars() 19665 1727204184.96725: Calling all_inventory to load vars for managed-node3 19665 1727204184.96728: Calling groups_inventory to load vars for managed-node3 19665 1727204184.96731: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204184.96742: Calling all_plugins_play to load vars for managed-node3 19665 1727204184.96746: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204184.96752: Calling groups_plugins_play to load vars for managed-node3 19665 1727204184.98128: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000061 19665 1727204184.98133: WORKER PROCESS EXITING 19665 1727204184.99291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204185.00961: done with get_vars() 19665 1727204185.00995: done getting variables 19665 1727204185.01059: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.165) 0:00:35.877 ***** 19665 1727204185.01094: entering _queue_task() for managed-node3/package 19665 1727204185.01423: worker is 1 (out of 1 available) 19665 1727204185.01435: exiting _queue_task() for managed-node3/package 19665 1727204185.01450: done queuing things up, now waiting for results queue to drain 19665 1727204185.01452: waiting for pending results... 
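Note: the consent prompt (task 0affcd87-79f5-0dcc-3ea6-000000000061, main.yml:60) is implemented with the fail action, per the ActionModule load above, and is skipped on the same wireless/team guard. A hedged sketch with made-up message text; the real task may carry additional consent guards that this trace does not reveal:

    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: >-   # hypothetical wording, not taken from this log
          Activating wireless or team connections requires restarting
          NetworkManager; set the role's consent variable to allow it.
      when: __network_wireless_connections_defined or __network_team_connections_defined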
19665 1727204185.01847: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 19665 1727204185.01981: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000062 19665 1727204185.02003: variable 'ansible_search_path' from source: unknown 19665 1727204185.02016: variable 'ansible_search_path' from source: unknown 19665 1727204185.02063: calling self._execute() 19665 1727204185.02179: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.02191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.02206: variable 'omit' from source: magic vars 19665 1727204185.02750: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.02773: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204185.03102: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204185.03576: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204185.03631: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204185.03674: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204185.03768: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204185.03894: variable 'network_packages' from source: role '' defaults 19665 1727204185.04016: variable '__network_provider_setup' from source: role '' defaults 19665 1727204185.04036: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204185.04111: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204185.04125: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204185.04196: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204185.04404: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204185.07058: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204185.07134: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204185.07187: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204185.07227: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204185.07261: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204185.07355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.07394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.07429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.07480: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.07501: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.07556: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.07589: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.07623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.07674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.07693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.07956: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204185.08089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.08117: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.08150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.08199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.08219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.08321: variable 'ansible_python' from source: facts 19665 1727204185.08358: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204185.08459: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204185.08557: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204185.08694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.08727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 19665 1727204185.08759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.08806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.08830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.08885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.08926: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.08959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.09005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.09027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.09194: variable 'network_connections' from source: play vars 19665 1727204185.09205: variable 'profile' from source: play vars 19665 1727204185.09324: variable 'profile' from source: play vars 19665 1727204185.09336: variable 'interface' from source: set_fact 19665 1727204185.09443: variable 'interface' from source: set_fact 19665 1727204185.09524: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204185.09546: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204185.09570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.09595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204185.09639: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204185.09832: variable 'network_connections' from source: play vars 19665 1727204185.09835: variable 'profile' from source: play vars 19665 1727204185.09914: variable 'profile' from source: play vars 19665 1727204185.09922: variable 'interface' from source: set_fact 19665 1727204185.09973: variable 'interface' from source: set_fact 19665 1727204185.09999: variable 
'__network_packages_default_wireless' from source: role '' defaults 19665 1727204185.10077: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204185.10285: variable 'network_connections' from source: play vars 19665 1727204185.10288: variable 'profile' from source: play vars 19665 1727204185.10335: variable 'profile' from source: play vars 19665 1727204185.10338: variable 'interface' from source: set_fact 19665 1727204185.10414: variable 'interface' from source: set_fact 19665 1727204185.10434: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204185.10495: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204185.10699: variable 'network_connections' from source: play vars 19665 1727204185.10702: variable 'profile' from source: play vars 19665 1727204185.10750: variable 'profile' from source: play vars 19665 1727204185.10753: variable 'interface' from source: set_fact 19665 1727204185.10828: variable 'interface' from source: set_fact 19665 1727204185.10871: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204185.10916: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204185.10922: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204185.10967: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204185.11174: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204185.11751: variable 'network_connections' from source: play vars 19665 1727204185.11763: variable 'profile' from source: play vars 19665 1727204185.11835: variable 'profile' from source: play vars 19665 1727204185.11843: variable 'interface' from source: set_fact 19665 1727204185.11923: variable 'interface' from source: set_fact 19665 1727204185.11937: variable 'ansible_distribution' from source: facts 19665 1727204185.11945: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.11955: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.11979: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204185.12309: variable 'ansible_distribution' from source: facts 19665 1727204185.12313: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.12317: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.12334: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204185.12479: variable 'ansible_distribution' from source: facts 19665 1727204185.12482: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.12485: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.12508: variable 'network_provider' from source: set_fact 19665 1727204185.12520: variable 'ansible_facts' from source: unknown 19665 1727204185.12952: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 19665 1727204185.12955: when evaluation is False, skipping this task 19665 1727204185.12958: _execute() done 19665 1727204185.12961: dumping result to json 19665 1727204185.12963: done dumping result, returning 19665 1727204185.12971: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 
[0affcd87-79f5-0dcc-3ea6-000000000062] 19665 1727204185.12976: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000062 19665 1727204185.13067: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000062 19665 1727204185.13070: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 19665 1727204185.13121: no more pending results, returning what we have 19665 1727204185.13125: results queue empty 19665 1727204185.13126: checking for any_errors_fatal 19665 1727204185.13133: done checking for any_errors_fatal 19665 1727204185.13133: checking for max_fail_percentage 19665 1727204185.13135: done checking for max_fail_percentage 19665 1727204185.13136: checking to see if all hosts have failed and the running result is not ok 19665 1727204185.13137: done checking to see if all hosts have failed 19665 1727204185.13137: getting the remaining hosts for this loop 19665 1727204185.13142: done getting the remaining hosts for this loop 19665 1727204185.13145: getting the next task for host managed-node3 19665 1727204185.13152: done getting next task for host managed-node3 19665 1727204185.13156: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204185.13158: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204185.13173: getting variables 19665 1727204185.13175: in VariableManager get_vars() 19665 1727204185.13214: Calling all_inventory to load vars for managed-node3 19665 1727204185.13217: Calling groups_inventory to load vars for managed-node3 19665 1727204185.13220: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204185.13235: Calling all_plugins_play to load vars for managed-node3 19665 1727204185.13238: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204185.13243: Calling groups_plugins_play to load vars for managed-node3 19665 1727204185.14670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204185.16646: done with get_vars() 19665 1727204185.16697: done getting variables 19665 1727204185.16760: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.156) 0:00:36.034 ***** 19665 1727204185.16795: entering _queue_task() for managed-node3/package 19665 1727204185.17129: worker is 1 (out of 1 available) 19665 1727204185.17145: exiting _queue_task() for managed-node3/package 19665 1727204185.17157: done queuing things up, now waiting for results queue to drain 19665 1727204185.17159: waiting for pending results... 
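Note: the "Install packages" task (0affcd87-79f5-0dcc-3ea6-000000000062, main.yml:73) uses the package action and is skipped because every entry in network_packages already appears in ansible_facts.packages, i.e. the guard "not network_packages is subset(ansible_facts.packages.keys())" is false. A hedged sketch consistent with that guard:

    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"
        state: present   # assumption based on the task name
      when: not network_packages is subset(ansible_facts.packages.keys())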
19665 1727204185.17447: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 19665 1727204185.17544: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000063 19665 1727204185.17557: variable 'ansible_search_path' from source: unknown 19665 1727204185.17560: variable 'ansible_search_path' from source: unknown 19665 1727204185.17594: calling self._execute() 19665 1727204185.17673: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.17677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.17689: variable 'omit' from source: magic vars 19665 1727204185.17981: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.17991: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204185.18081: variable 'network_state' from source: role '' defaults 19665 1727204185.18089: Evaluated conditional (network_state != {}): False 19665 1727204185.18092: when evaluation is False, skipping this task 19665 1727204185.18095: _execute() done 19665 1727204185.18098: dumping result to json 19665 1727204185.18101: done dumping result, returning 19665 1727204185.18106: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-000000000063] 19665 1727204185.18112: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000063 19665 1727204185.18202: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000063 19665 1727204185.18206: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204185.18250: no more pending results, returning what we have 19665 1727204185.18254: results queue empty 19665 1727204185.18255: checking for any_errors_fatal 19665 1727204185.18262: done checking for any_errors_fatal 19665 1727204185.18263: checking for max_fail_percentage 19665 1727204185.18266: done checking for max_fail_percentage 19665 1727204185.18267: checking to see if all hosts have failed and the running result is not ok 19665 1727204185.18268: done checking to see if all hosts have failed 19665 1727204185.18268: getting the remaining hosts for this loop 19665 1727204185.18270: done getting the remaining hosts for this loop 19665 1727204185.18274: getting the next task for host managed-node3 19665 1727204185.18280: done getting next task for host managed-node3 19665 1727204185.18284: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204185.18286: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204185.18303: getting variables 19665 1727204185.18305: in VariableManager get_vars() 19665 1727204185.18347: Calling all_inventory to load vars for managed-node3 19665 1727204185.18350: Calling groups_inventory to load vars for managed-node3 19665 1727204185.18355: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204185.18374: Calling all_plugins_play to load vars for managed-node3 19665 1727204185.18378: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204185.18382: Calling groups_plugins_play to load vars for managed-node3 19665 1727204185.20493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204185.23208: done with get_vars() 19665 1727204185.23244: done getting variables 19665 1727204185.23309: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.065) 0:00:36.100 ***** 19665 1727204185.23346: entering _queue_task() for managed-node3/package 19665 1727204185.23693: worker is 1 (out of 1 available) 19665 1727204185.23706: exiting _queue_task() for managed-node3/package 19665 1727204185.23725: done queuing things up, now waiting for results queue to drain 19665 1727204185.23727: waiting for pending results... 
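Note: the nmstate-provider install (task 0affcd87-79f5-0dcc-3ea6-000000000063, main.yml:85) is skipped because network_state is empty for this play. A hedged sketch; the package names are inferred from the task title only and may not match the real task:

    - name: Install NetworkManager and nmstate when using network_state variable
      ansible.builtin.package:
        name:
          - NetworkManager   # inferred from the task title
          - nmstate          # inferred from the task title
        state: present
      when: network_state != {}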
19665 1727204185.24022: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 19665 1727204185.24135: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000064 19665 1727204185.24151: variable 'ansible_search_path' from source: unknown 19665 1727204185.24161: variable 'ansible_search_path' from source: unknown 19665 1727204185.24201: calling self._execute() 19665 1727204185.24311: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.24315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.24326: variable 'omit' from source: magic vars 19665 1727204185.24779: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.24790: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204185.24929: variable 'network_state' from source: role '' defaults 19665 1727204185.24941: Evaluated conditional (network_state != {}): False 19665 1727204185.24947: when evaluation is False, skipping this task 19665 1727204185.24951: _execute() done 19665 1727204185.24953: dumping result to json 19665 1727204185.24955: done dumping result, returning 19665 1727204185.24966: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [0affcd87-79f5-0dcc-3ea6-000000000064] 19665 1727204185.24973: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000064 skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204185.25110: no more pending results, returning what we have 19665 1727204185.25114: results queue empty 19665 1727204185.25115: checking for any_errors_fatal 19665 1727204185.25123: done checking for any_errors_fatal 19665 1727204185.25124: checking for max_fail_percentage 19665 1727204185.25125: done checking for max_fail_percentage 19665 1727204185.25128: checking to see if all hosts have failed and the running result is not ok 19665 1727204185.25129: done checking to see if all hosts have failed 19665 1727204185.25130: getting the remaining hosts for this loop 19665 1727204185.25132: done getting the remaining hosts for this loop 19665 1727204185.25136: getting the next task for host managed-node3 19665 1727204185.25146: done getting next task for host managed-node3 19665 1727204185.25150: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204185.25152: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204185.25170: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000064 19665 1727204185.25174: WORKER PROCESS EXITING 19665 1727204185.25183: getting variables 19665 1727204185.25185: in VariableManager get_vars() 19665 1727204185.25225: Calling all_inventory to load vars for managed-node3 19665 1727204185.25228: Calling groups_inventory to load vars for managed-node3 19665 1727204185.25230: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204185.25246: Calling all_plugins_play to load vars for managed-node3 19665 1727204185.25250: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204185.25253: Calling groups_plugins_play to load vars for managed-node3 19665 1727204185.27222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204185.30322: done with get_vars() 19665 1727204185.30437: done getting variables 19665 1727204185.30548: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.072) 0:00:36.172 ***** 19665 1727204185.30588: entering _queue_task() for managed-node3/service 19665 1727204185.31102: worker is 1 (out of 1 available) 19665 1727204185.31141: exiting _queue_task() for managed-node3/service 19665 1727204185.31154: done queuing things up, now waiting for results queue to drain 19665 1727204185.31191: waiting for pending results... 
19665 1727204185.32779: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 19665 1727204185.33183: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000065 19665 1727204185.33257: variable 'ansible_search_path' from source: unknown 19665 1727204185.33261: variable 'ansible_search_path' from source: unknown 19665 1727204185.33402: calling self._execute() 19665 1727204185.33539: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.33547: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.33558: variable 'omit' from source: magic vars 19665 1727204185.34180: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.34311: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204185.34564: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204185.35006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204185.39404: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204185.39483: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204185.39524: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204185.39558: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204185.39589: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204185.39679: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.39779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.39843: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.39960: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.40061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.40114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.40137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.40191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 19665 1727204185.40236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.40256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.40355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.40383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.40410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.40454: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.40471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.40671: variable 'network_connections' from source: play vars 19665 1727204185.40683: variable 'profile' from source: play vars 19665 1727204185.40791: variable 'profile' from source: play vars 19665 1727204185.40794: variable 'interface' from source: set_fact 19665 1727204185.40877: variable 'interface' from source: set_fact 19665 1727204185.40959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204185.41230: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204185.41320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204185.41376: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204185.41481: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204185.41591: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204185.41617: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204185.41644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.41678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204185.41754: variable '__network_team_connections_defined' from source: role '' defaults 19665 
1727204185.42244: variable 'network_connections' from source: play vars 19665 1727204185.42280: variable 'profile' from source: play vars 19665 1727204185.42400: variable 'profile' from source: play vars 19665 1727204185.42403: variable 'interface' from source: set_fact 19665 1727204185.42534: variable 'interface' from source: set_fact 19665 1727204185.42567: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 19665 1727204185.42570: when evaluation is False, skipping this task 19665 1727204185.42573: _execute() done 19665 1727204185.42575: dumping result to json 19665 1727204185.42580: done dumping result, returning 19665 1727204185.42615: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [0affcd87-79f5-0dcc-3ea6-000000000065] 19665 1727204185.42626: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000065 19665 1727204185.42742: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000065 19665 1727204185.42746: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 19665 1727204185.42816: no more pending results, returning what we have 19665 1727204185.42821: results queue empty 19665 1727204185.42822: checking for any_errors_fatal 19665 1727204185.42849: done checking for any_errors_fatal 19665 1727204185.42873: checking for max_fail_percentage 19665 1727204185.42876: done checking for max_fail_percentage 19665 1727204185.42877: checking to see if all hosts have failed and the running result is not ok 19665 1727204185.42878: done checking to see if all hosts have failed 19665 1727204185.42878: getting the remaining hosts for this loop 19665 1727204185.42880: done getting the remaining hosts for this loop 19665 1727204185.42885: getting the next task for host managed-node3 19665 1727204185.42941: done getting next task for host managed-node3 19665 1727204185.42947: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204185.42949: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204185.43023: getting variables 19665 1727204185.43026: in VariableManager get_vars() 19665 1727204185.43071: Calling all_inventory to load vars for managed-node3 19665 1727204185.43073: Calling groups_inventory to load vars for managed-node3 19665 1727204185.43097: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204185.43109: Calling all_plugins_play to load vars for managed-node3 19665 1727204185.43112: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204185.43115: Calling groups_plugins_play to load vars for managed-node3 19665 1727204185.46543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204185.48487: done with get_vars() 19665 1727204185.48514: done getting variables 19665 1727204185.48580: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:56:25 -0400 (0:00:00.180) 0:00:36.352 ***** 19665 1727204185.48610: entering _queue_task() for managed-node3/service 19665 1727204185.48896: worker is 1 (out of 1 available) 19665 1727204185.48910: exiting _queue_task() for managed-node3/service 19665 1727204185.48923: done queuing things up, now waiting for results queue to drain 19665 1727204185.48925: waiting for pending results... 19665 1727204185.49180: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 19665 1727204185.49333: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000066 19665 1727204185.49348: variable 'ansible_search_path' from source: unknown 19665 1727204185.49352: variable 'ansible_search_path' from source: unknown 19665 1727204185.49400: calling self._execute() 19665 1727204185.49500: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.49504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.49514: variable 'omit' from source: magic vars 19665 1727204185.49942: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.49949: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204185.50593: variable 'network_provider' from source: set_fact 19665 1727204185.50597: variable 'network_state' from source: role '' defaults 19665 1727204185.50599: Evaluated conditional (network_provider == "nm" or network_state != {}): True 19665 1727204185.50601: variable 'omit' from source: magic vars 19665 1727204185.50603: variable 'omit' from source: magic vars 19665 1727204185.50605: variable 'network_service_name' from source: role '' defaults 19665 1727204185.50607: variable 'network_service_name' from source: role '' defaults 19665 1727204185.50609: variable '__network_provider_setup' from source: role '' defaults 19665 1727204185.50611: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204185.50613: variable '__network_service_name_default_nm' from source: role '' defaults 19665 1727204185.50615: variable '__network_packages_default_nm' from source: role '' defaults 
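[annotation] The trace above shows the role skipping the wireless/team restart handler (its conditional evaluated to False) and then entering the "Enable and start NetworkManager" task from tasks/main.yml:122, gated on ansible_distribution_major_version != '6' and on network_provider == "nm" or network_state != {}, with the service action plugin. As a minimal sketch reconstructed only from the conditionals, variables, and action visible in this log (not copied from the role source, so details may differ), that task could be shaped like:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # resolves to "NetworkManager" for the nm provider in this run
    state: started
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "nm" or network_state != {}
  no_log: true   # consistent with the censored task result recorded later in this log

[end annotation]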
19665 1727204185.50617: variable '__network_packages_default_nm' from source: role '' defaults 19665 1727204185.50985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204185.53465: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204185.53513: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204185.53540: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204185.53606: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204185.53624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204185.53712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.53748: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.53779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.53823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.53850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.53904: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.53931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.53956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.53994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.54005: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.54350: variable '__network_packages_default_gobject_packages' from source: role '' defaults 19665 1727204185.54642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.54681: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.54742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.54790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.54815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.55047: variable 'ansible_python' from source: facts 19665 1727204185.55096: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 19665 1727204185.55300: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204185.55472: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204185.55577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.55629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.55642: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.55676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.55684: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.55718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204185.55738: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204185.55757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.55787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204185.55797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204185.55893: variable 'network_connections' from 
source: play vars 19665 1727204185.55900: variable 'profile' from source: play vars 19665 1727204185.55955: variable 'profile' from source: play vars 19665 1727204185.55960: variable 'interface' from source: set_fact 19665 1727204185.56007: variable 'interface' from source: set_fact 19665 1727204185.56131: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204185.56265: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204185.56304: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204185.56337: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204185.56382: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204185.56431: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204185.56457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204185.56482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204185.56508: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204185.56547: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204185.56732: variable 'network_connections' from source: play vars 19665 1727204185.56737: variable 'profile' from source: play vars 19665 1727204185.56797: variable 'profile' from source: play vars 19665 1727204185.56800: variable 'interface' from source: set_fact 19665 1727204185.56846: variable 'interface' from source: set_fact 19665 1727204185.56875: variable '__network_packages_default_wireless' from source: role '' defaults 19665 1727204185.56930: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204185.57130: variable 'network_connections' from source: play vars 19665 1727204185.57133: variable 'profile' from source: play vars 19665 1727204185.57191: variable 'profile' from source: play vars 19665 1727204185.57194: variable 'interface' from source: set_fact 19665 1727204185.57249: variable 'interface' from source: set_fact 19665 1727204185.57270: variable '__network_packages_default_team' from source: role '' defaults 19665 1727204185.57326: variable '__network_team_connections_defined' from source: role '' defaults 19665 1727204185.57521: variable 'network_connections' from source: play vars 19665 1727204185.57525: variable 'profile' from source: play vars 19665 1727204185.57578: variable 'profile' from source: play vars 19665 1727204185.57581: variable 'interface' from source: set_fact 19665 1727204185.57636: variable 'interface' from source: set_fact 19665 1727204185.57680: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204185.57722: variable '__network_service_name_default_initscripts' from source: role '' defaults 19665 1727204185.57729: 
variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204185.57775: variable '__network_packages_default_initscripts' from source: role '' defaults 19665 1727204185.57912: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 19665 1727204185.58235: variable 'network_connections' from source: play vars 19665 1727204185.58238: variable 'profile' from source: play vars 19665 1727204185.58285: variable 'profile' from source: play vars 19665 1727204185.58289: variable 'interface' from source: set_fact 19665 1727204185.58337: variable 'interface' from source: set_fact 19665 1727204185.58346: variable 'ansible_distribution' from source: facts 19665 1727204185.58354: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.58357: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.58370: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 19665 1727204185.58484: variable 'ansible_distribution' from source: facts 19665 1727204185.58487: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.58491: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.58502: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 19665 1727204185.58621: variable 'ansible_distribution' from source: facts 19665 1727204185.58625: variable '__network_rh_distros' from source: role '' defaults 19665 1727204185.58627: variable 'ansible_distribution_major_version' from source: facts 19665 1727204185.58655: variable 'network_provider' from source: set_fact 19665 1727204185.58674: variable 'omit' from source: magic vars 19665 1727204185.58699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204185.58720: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204185.58737: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204185.58752: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204185.58760: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204185.58786: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204185.58792: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.58794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.58861: Set connection var ansible_connection to ssh 19665 1727204185.58868: Set connection var ansible_shell_type to sh 19665 1727204185.58874: Set connection var ansible_timeout to 10 19665 1727204185.58879: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204185.58886: Set connection var ansible_shell_executable to /bin/sh 19665 1727204185.58892: Set connection var ansible_pipelining to False 19665 1727204185.58911: variable 'ansible_shell_executable' from source: unknown 19665 1727204185.58914: variable 'ansible_connection' from source: unknown 19665 1727204185.58917: variable 'ansible_module_compression' from source: unknown 19665 1727204185.58919: variable 'ansible_shell_type' from source: unknown 19665 1727204185.58922: variable 'ansible_shell_executable' from 
source: unknown 19665 1727204185.58924: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204185.58931: variable 'ansible_pipelining' from source: unknown 19665 1727204185.58933: variable 'ansible_timeout' from source: unknown 19665 1727204185.58935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204185.59011: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204185.59020: variable 'omit' from source: magic vars 19665 1727204185.59023: starting attempt loop 19665 1727204185.59025: running the handler 19665 1727204185.59084: variable 'ansible_facts' from source: unknown 19665 1727204185.59588: _low_level_execute_command(): starting 19665 1727204185.59596: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204185.60133: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204185.60150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.60176: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204185.60188: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204185.60199: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.60245: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204185.60265: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204185.60325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204185.61944: stdout chunk (state=3): >>>/root <<< 19665 1727204185.62050: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204185.62104: stderr chunk (state=3): >>><<< 19665 1727204185.62108: stdout chunk (state=3): >>><<< 19665 1727204185.62125: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204185.62135: _low_level_execute_command(): starting 19665 1727204185.62148: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220 `" && echo ansible-tmp-1727204185.6212547-22628-23717193840220="` echo /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220 `" ) && sleep 0' 19665 1727204185.62607: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204185.62618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.62646: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.62658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.62670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.62716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204185.62728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204185.62785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204185.64609: stdout chunk (state=3): >>>ansible-tmp-1727204185.6212547-22628-23717193840220=/root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220 <<< 19665 1727204185.64725: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204185.64781: stderr chunk (state=3): >>><<< 19665 1727204185.64784: stdout chunk (state=3): >>><<< 19665 1727204185.64799: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204185.6212547-22628-23717193840220=/root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204185.64829: variable 'ansible_module_compression' from source: unknown 19665 1727204185.64878: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 19665 1727204185.64925: variable 'ansible_facts' from source: unknown 19665 1727204185.65065: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/AnsiballZ_systemd.py 19665 1727204185.65190: Sending initial data 19665 1727204185.65193: Sent initial data (155 bytes) 19665 1727204185.65899: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204185.65904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.65940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204185.65954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204185.65967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.66010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204185.66029: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204185.66082: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204185.67776: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204185.67812: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 
64 <<< 19665 1727204185.67859: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp6ljgl3gy /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/AnsiballZ_systemd.py <<< 19665 1727204185.67890: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204185.69651: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204185.69770: stderr chunk (state=3): >>><<< 19665 1727204185.69774: stdout chunk (state=3): >>><<< 19665 1727204185.69791: done transferring module to remote 19665 1727204185.69802: _low_level_execute_command(): starting 19665 1727204185.69805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/ /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/AnsiballZ_systemd.py && sleep 0' 19665 1727204185.70274: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204185.70280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.70313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.70326: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.70420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204185.70438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204185.70510: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204185.72200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204185.72246: stderr chunk (state=3): >>><<< 19665 1727204185.72249: stdout chunk (state=3): >>><<< 19665 1727204185.72262: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204185.72268: _low_level_execute_command(): starting 19665 1727204185.72271: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/AnsiballZ_systemd.py && sleep 0' 19665 1727204185.72711: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204185.72723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204185.72779: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204185.72815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204185.72822: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204185.72916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204185.98150: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call 
org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16134144", "MemoryAvailable": "infinity", "CPUUsageNSec": "1413041000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": 
"no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", 
"no_block": false, "force": null, "masked": null}}} <<< 19665 1727204185.99595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204185.99599: stdout chunk (state=3): >>><<< 19665 1727204185.99601: stderr chunk (state=3): >>><<< 19665 1727204185.99894: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "NotifyAccess": "none", "RestartUSec": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "616", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ExecMainStartTimestampMonotonic": "12637094", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "616", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2418", "MemoryCurrent": "16134144", "MemoryAvailable": "infinity", "CPUUsageNSec": "1413041000", "TasksCurrent": "3", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "MemoryHigh": "infinity", "MemoryMax": "infinity", "MemorySwapMax": "infinity", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": 
"22342", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13964", "LimitNPROCSoft": "13964", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13964", "LimitSIGPENDINGSoft": "13964", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service shutdown.target multi-user.target network.target cloud-init.service NetworkManager-wait-online.service", "After": "dbus-broker.service systemd-journald.socket sysinit.target network-pre.target system.slice cloud-init-local.service basic.target 
dbus.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:50 EDT", "StateChangeTimestampMonotonic": "376906768", "InactiveExitTimestamp": "Tue 2024-09-24 14:47:46 EDT", "InactiveExitTimestampMonotonic": "12637298", "ActiveEnterTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ActiveEnterTimestampMonotonic": "12973041", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:47:46 EDT", "ConditionTimestampMonotonic": "12630855", "AssertTimestamp": "Tue 2024-09-24 14:47:46 EDT", "AssertTimestampMonotonic": "12630857", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "f94263a9def7408cb754f60792d8c658", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204185.99904: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204185.99906: _low_level_execute_command(): starting 19665 1727204185.99909: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204185.6212547-22628-23717193840220/ > /dev/null 2>&1 && sleep 0' 19665 1727204186.00898: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.00905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.00923: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.00941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.00980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.00986: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204186.00996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.01009: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.01017: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204186.01026: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204186.01036: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.01052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.01063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.01072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.01079: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204186.01088: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.01165: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.01184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.01196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.01264: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.03146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204186.03150: stdout chunk (state=3): >>><<< 19665 1727204186.03154: stderr chunk (state=3): >>><<< 19665 1727204186.03175: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204186.03183: handler run complete 19665 1727204186.03259: attempt loop complete, returning result 19665 1727204186.03262: _execute() done 19665 1727204186.03266: dumping result to json 19665 1727204186.03286: done dumping result, returning 19665 1727204186.03296: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [0affcd87-79f5-0dcc-3ea6-000000000066] 19665 1727204186.03298: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000066 19665 1727204186.03531: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000066 19665 1727204186.03535: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204186.03620: no more pending results, returning what we have 19665 1727204186.03624: results queue empty 19665 1727204186.03625: checking for any_errors_fatal 19665 1727204186.03633: done checking for any_errors_fatal 19665 1727204186.03633: checking for max_fail_percentage 19665 1727204186.03636: done checking for max_fail_percentage 19665 1727204186.03637: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.03638: done checking to see if all hosts have failed 19665 1727204186.03638: getting the remaining hosts for this loop 19665 1727204186.03640: done getting the remaining hosts for this loop 19665 1727204186.03645: getting the next task for host managed-node3 19665 1727204186.03652: done getting next task for host managed-node3 19665 1727204186.03657: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204186.03659: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204186.03672: getting variables 19665 1727204186.03674: in VariableManager get_vars() 19665 1727204186.03713: Calling all_inventory to load vars for managed-node3 19665 1727204186.03716: Calling groups_inventory to load vars for managed-node3 19665 1727204186.03718: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.03728: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.03731: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.03733: Calling groups_plugins_play to load vars for managed-node3 19665 1727204186.05352: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204186.07066: done with get_vars() 19665 1727204186.07096: done getting variables 19665 1727204186.07163: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.585) 0:00:36.938 ***** 19665 1727204186.07204: entering _queue_task() for managed-node3/service 19665 1727204186.07533: worker is 1 (out of 1 available) 19665 1727204186.07552: exiting _queue_task() for managed-node3/service 19665 1727204186.07565: done queuing things up, now waiting for results queue to drain 19665 1727204186.07567: waiting for pending results... 19665 1727204186.07857: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 19665 1727204186.07961: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000067 19665 1727204186.07975: variable 'ansible_search_path' from source: unknown 19665 1727204186.07982: variable 'ansible_search_path' from source: unknown 19665 1727204186.08020: calling self._execute() 19665 1727204186.08118: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.08124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.08134: variable 'omit' from source: magic vars 19665 1727204186.08521: variable 'ansible_distribution_major_version' from source: facts 19665 1727204186.08538: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204186.08668: variable 'network_provider' from source: set_fact 19665 1727204186.08673: Evaluated conditional (network_provider == "nm"): True 19665 1727204186.08772: variable '__network_wpa_supplicant_required' from source: role '' defaults 19665 1727204186.08868: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 19665 1727204186.09046: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204186.11834: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204186.11910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204186.11946: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204186.11982: Loading FilterModule 
'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204186.12016: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204186.12105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204186.12141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204186.12166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204186.12211: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204186.12228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204186.12282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204186.12305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204186.12335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204186.12379: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204186.12393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204186.12438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204186.12468: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204186.12494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204186.12537: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204186.12552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 19665 1727204186.13169: variable 'network_connections' from source: play vars 19665 1727204186.13172: variable 'profile' from source: play vars 19665 1727204186.13175: variable 'profile' from source: play vars 19665 1727204186.13177: variable 'interface' from source: set_fact 19665 1727204186.13179: variable 'interface' from source: set_fact 19665 1727204186.13181: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 19665 1727204186.13183: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 19665 1727204186.13185: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 19665 1727204186.13223: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 19665 1727204186.13252: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 19665 1727204186.13301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 19665 1727204186.13328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 19665 1727204186.13354: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204186.13381: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 19665 1727204186.13434: variable '__network_wireless_connections_defined' from source: role '' defaults 19665 1727204186.13652: variable 'network_connections' from source: play vars 19665 1727204186.13656: variable 'profile' from source: play vars 19665 1727204186.13718: variable 'profile' from source: play vars 19665 1727204186.13721: variable 'interface' from source: set_fact 19665 1727204186.13790: variable 'interface' from source: set_fact 19665 1727204186.13818: Evaluated conditional (__network_wpa_supplicant_required): False 19665 1727204186.13821: when evaluation is False, skipping this task 19665 1727204186.13823: _execute() done 19665 1727204186.13834: dumping result to json 19665 1727204186.13837: done dumping result, returning 19665 1727204186.13841: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [0affcd87-79f5-0dcc-3ea6-000000000067] 19665 1727204186.13846: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000067 19665 1727204186.13945: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000067 19665 1727204186.13948: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 19665 1727204186.14010: no more pending results, returning what we have 19665 1727204186.14014: results queue empty 19665 1727204186.14016: checking for any_errors_fatal 19665 1727204186.14032: done checking for any_errors_fatal 19665 1727204186.14033: checking for max_fail_percentage 19665 1727204186.14035: done checking for max_fail_percentage 19665 
1727204186.14035: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.14036: done checking to see if all hosts have failed 19665 1727204186.14037: getting the remaining hosts for this loop 19665 1727204186.14039: done getting the remaining hosts for this loop 19665 1727204186.14043: getting the next task for host managed-node3 19665 1727204186.14051: done getting next task for host managed-node3 19665 1727204186.14058: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204186.14060: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204186.14076: getting variables 19665 1727204186.14078: in VariableManager get_vars() 19665 1727204186.14120: Calling all_inventory to load vars for managed-node3 19665 1727204186.14123: Calling groups_inventory to load vars for managed-node3 19665 1727204186.14126: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.14137: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.14140: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.14144: Calling groups_plugins_play to load vars for managed-node3 19665 1727204186.15989: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204186.17666: done with get_vars() 19665 1727204186.17694: done getting variables 19665 1727204186.17763: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.105) 0:00:37.044 ***** 19665 1727204186.17796: entering _queue_task() for managed-node3/service 19665 1727204186.18143: worker is 1 (out of 1 available) 19665 1727204186.18162: exiting _queue_task() for managed-node3/service 19665 1727204186.18177: done queuing things up, now waiting for results queue to drain 19665 1727204186.18179: waiting for pending results... 
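The "Enable and start wpa_supplicant" task above was skipped: network_provider == "nm" evaluated True, but __network_wpa_supplicant_required evaluated False because no wireless or IEEE 802.1x connections are defined in the play. A hedged sketch of such a guarded service task (variable names taken from the log, task body assumed):

- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool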
19665 1727204186.18481: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 19665 1727204186.18588: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000068 19665 1727204186.18605: variable 'ansible_search_path' from source: unknown 19665 1727204186.18609: variable 'ansible_search_path' from source: unknown 19665 1727204186.18649: calling self._execute() 19665 1727204186.18754: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.18758: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.18771: variable 'omit' from source: magic vars 19665 1727204186.19182: variable 'ansible_distribution_major_version' from source: facts 19665 1727204186.19196: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204186.19322: variable 'network_provider' from source: set_fact 19665 1727204186.19325: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204186.19330: when evaluation is False, skipping this task 19665 1727204186.19333: _execute() done 19665 1727204186.19335: dumping result to json 19665 1727204186.19342: done dumping result, returning 19665 1727204186.19345: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [0affcd87-79f5-0dcc-3ea6-000000000068] 19665 1727204186.19353: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000068 19665 1727204186.19456: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000068 19665 1727204186.19458: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 19665 1727204186.19531: no more pending results, returning what we have 19665 1727204186.19535: results queue empty 19665 1727204186.19536: checking for any_errors_fatal 19665 1727204186.19545: done checking for any_errors_fatal 19665 1727204186.19546: checking for max_fail_percentage 19665 1727204186.19548: done checking for max_fail_percentage 19665 1727204186.19549: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.19550: done checking to see if all hosts have failed 19665 1727204186.19551: getting the remaining hosts for this loop 19665 1727204186.19553: done getting the remaining hosts for this loop 19665 1727204186.19558: getting the next task for host managed-node3 19665 1727204186.19567: done getting next task for host managed-node3 19665 1727204186.19571: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204186.19574: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204186.19590: getting variables 19665 1727204186.19592: in VariableManager get_vars() 19665 1727204186.19629: Calling all_inventory to load vars for managed-node3 19665 1727204186.19631: Calling groups_inventory to load vars for managed-node3 19665 1727204186.19633: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.19645: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.19648: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.19651: Calling groups_plugins_play to load vars for managed-node3 19665 1727204186.21253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204186.22947: done with get_vars() 19665 1727204186.22980: done getting variables 19665 1727204186.23046: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.052) 0:00:37.097 ***** 19665 1727204186.23082: entering _queue_task() for managed-node3/copy 19665 1727204186.23411: worker is 1 (out of 1 available) 19665 1727204186.23425: exiting _queue_task() for managed-node3/copy 19665 1727204186.23438: done queuing things up, now waiting for results queue to drain 19665 1727204186.23439: waiting for pending results... 19665 1727204186.23731: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 19665 1727204186.23837: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000069 19665 1727204186.23850: variable 'ansible_search_path' from source: unknown 19665 1727204186.23853: variable 'ansible_search_path' from source: unknown 19665 1727204186.23900: calling self._execute() 19665 1727204186.23994: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.24003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.24015: variable 'omit' from source: magic vars 19665 1727204186.24399: variable 'ansible_distribution_major_version' from source: facts 19665 1727204186.24411: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204186.24547: variable 'network_provider' from source: set_fact 19665 1727204186.24554: Evaluated conditional (network_provider == "initscripts"): False 19665 1727204186.24557: when evaluation is False, skipping this task 19665 1727204186.24560: _execute() done 19665 1727204186.24562: dumping result to json 19665 1727204186.24566: done dumping result, returning 19665 1727204186.24574: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [0affcd87-79f5-0dcc-3ea6-000000000069] 19665 1727204186.24580: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000069 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 19665 1727204186.24723: no more pending results, returning what we have 19665 
1727204186.24728: results queue empty 19665 1727204186.24729: checking for any_errors_fatal 19665 1727204186.24734: done checking for any_errors_fatal 19665 1727204186.24735: checking for max_fail_percentage 19665 1727204186.24737: done checking for max_fail_percentage 19665 1727204186.24738: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.24738: done checking to see if all hosts have failed 19665 1727204186.24739: getting the remaining hosts for this loop 19665 1727204186.24741: done getting the remaining hosts for this loop 19665 1727204186.24746: getting the next task for host managed-node3 19665 1727204186.24755: done getting next task for host managed-node3 19665 1727204186.24759: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204186.24762: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204186.24780: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000069 19665 1727204186.24789: getting variables 19665 1727204186.24791: in VariableManager get_vars() 19665 1727204186.24832: Calling all_inventory to load vars for managed-node3 19665 1727204186.24835: Calling groups_inventory to load vars for managed-node3 19665 1727204186.24838: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.24852: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.24855: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.24859: Calling groups_plugins_play to load vars for managed-node3 19665 1727204186.25682: WORKER PROCESS EXITING 19665 1727204186.31481: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204186.33155: done with get_vars() 19665 1727204186.33185: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.101) 0:00:37.199 ***** 19665 1727204186.33264: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204186.33596: worker is 1 (out of 1 available) 19665 1727204186.33609: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 19665 1727204186.33622: done queuing things up, now waiting for results queue to drain 19665 1727204186.33624: waiting for pending results... 
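Both "Enable network service" and "Ensure initscripts network file dependency is present" were skipped above because network_provider == "initscripts" is False on this NetworkManager-managed host. A sketch of how an initscripts-only task is typically guarded (an assumption for illustration, not the role's source):

- name: Enable network service
  ansible.builtin.service:
    name: network               # legacy initscripts service; assumed name
    state: started
    enabled: true
  when: network_provider == "initscripts"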
19665 1727204186.33918: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 19665 1727204186.34031: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006a 19665 1727204186.34045: variable 'ansible_search_path' from source: unknown 19665 1727204186.34049: variable 'ansible_search_path' from source: unknown 19665 1727204186.34090: calling self._execute() 19665 1727204186.34193: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.34197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.34210: variable 'omit' from source: magic vars 19665 1727204186.34620: variable 'ansible_distribution_major_version' from source: facts 19665 1727204186.34634: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204186.34643: variable 'omit' from source: magic vars 19665 1727204186.34689: variable 'omit' from source: magic vars 19665 1727204186.34861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 19665 1727204186.37224: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 19665 1727204186.37387: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 19665 1727204186.37429: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 19665 1727204186.37469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 19665 1727204186.37611: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 19665 1727204186.37706: variable 'network_provider' from source: set_fact 19665 1727204186.38159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 19665 1727204186.38193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 19665 1727204186.38219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 19665 1727204186.38352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 19665 1727204186.38411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 19665 1727204186.38573: variable 'omit' from source: magic vars 19665 1727204186.38924: variable 'omit' from source: magic vars 19665 1727204186.39152: variable 'network_connections' from source: play vars 19665 1727204186.39169: variable 'profile' from source: play vars 19665 1727204186.39235: variable 'profile' from source: play vars 19665 1727204186.39348: variable 'interface' from source: set_fact 19665 1727204186.39415: variable 'interface' from source: set_fact 19665 1727204186.39798: variable 'omit' from source: magic vars 19665 1727204186.39807: 
variable '__lsr_ansible_managed' from source: task vars 19665 1727204186.39874: variable '__lsr_ansible_managed' from source: task vars 19665 1727204186.40305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 19665 1727204186.40587: Loaded config def from plugin (lookup/template) 19665 1727204186.40598: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 19665 1727204186.40629: File lookup term: get_ansible_managed.j2 19665 1727204186.40638: variable 'ansible_search_path' from source: unknown 19665 1727204186.40652: evaluation_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 19665 1727204186.40677: search_path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 19665 1727204186.40704: variable 'ansible_search_path' from source: unknown 19665 1727204186.47528: variable 'ansible_managed' from source: unknown 19665 1727204186.47689: variable 'omit' from source: magic vars 19665 1727204186.47729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204186.47767: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204186.47794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204186.47817: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204186.47833: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204186.47872: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204186.47883: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.47893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.47994: Set connection var ansible_connection to ssh 19665 1727204186.48009: Set connection var ansible_shell_type to sh 19665 1727204186.48020: Set connection var ansible_timeout to 10 19665 1727204186.48030: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204186.48045: Set connection var ansible_shell_executable to /bin/sh 19665 1727204186.48058: Set connection var ansible_pipelining to False 19665 1727204186.48088: variable 'ansible_shell_executable' from source: unknown 19665 
1727204186.48096: variable 'ansible_connection' from source: unknown 19665 1727204186.48103: variable 'ansible_module_compression' from source: unknown 19665 1727204186.48112: variable 'ansible_shell_type' from source: unknown 19665 1727204186.48120: variable 'ansible_shell_executable' from source: unknown 19665 1727204186.48127: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.48135: variable 'ansible_pipelining' from source: unknown 19665 1727204186.48144: variable 'ansible_timeout' from source: unknown 19665 1727204186.48152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.48294: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204186.48320: variable 'omit' from source: magic vars 19665 1727204186.48330: starting attempt loop 19665 1727204186.48338: running the handler 19665 1727204186.48358: _low_level_execute_command(): starting 19665 1727204186.48372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204186.49141: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.49160: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.49178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.49199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.49249: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.49266: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204186.49281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.49299: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.49313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204186.49326: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204186.49342: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.49357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.49376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.49389: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.49400: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204186.49415: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.49493: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.49510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.49526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.49611: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.51246: stdout chunk (state=3): >>>/root <<< 19665 1727204186.51352: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 19665 1727204186.51494: stderr chunk (state=3): >>><<< 19665 1727204186.51510: stdout chunk (state=3): >>><<< 19665 1727204186.51570: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204186.51574: _low_level_execute_command(): starting 19665 1727204186.51576: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214 `" && echo ansible-tmp-1727204186.5154603-22655-153304612831214="` echo /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214 `" ) && sleep 0' 19665 1727204186.52401: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.52423: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.52454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.52477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.52529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.52552: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204186.52590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.52608: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.52953: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.52957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.53031: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.53034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.53049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 19665 1727204186.53131: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.54997: stdout chunk (state=3): >>>ansible-tmp-1727204186.5154603-22655-153304612831214=/root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214 <<< 19665 1727204186.55186: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204186.55234: stderr chunk (state=3): >>><<< 19665 1727204186.55237: stdout chunk (state=3): >>><<< 19665 1727204186.55376: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204186.5154603-22655-153304612831214=/root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204186.55380: variable 'ansible_module_compression' from source: unknown 19665 1727204186.55388: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 19665 1727204186.55484: variable 'ansible_facts' from source: unknown 19665 1727204186.55526: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/AnsiballZ_network_connections.py 19665 1727204186.55712: Sending initial data 19665 1727204186.55715: Sent initial data (168 bytes) 19665 1727204186.56787: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.56809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.56826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.56849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.56895: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.56914: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204186.56930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.56952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.56968: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204186.56981: stderr chunk (state=3): >>>debug1: 
re-parsing configuration <<< 19665 1727204186.56994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.57008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.57031: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.57047: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.57060: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204186.57078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.57162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.57188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.57204: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.57283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.59007: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204186.59053: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204186.59073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpsozewyqm /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/AnsiballZ_network_connections.py <<< 19665 1727204186.59107: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204186.60782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204186.60995: stderr chunk (state=3): >>><<< 19665 1727204186.61122: stdout chunk (state=3): >>><<< 19665 1727204186.61125: done transferring module to remote 19665 1727204186.61128: _low_level_execute_command(): starting 19665 1727204186.61130: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/ /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/AnsiballZ_network_connections.py && sleep 0' 19665 1727204186.61853: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.61870: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.61896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.61916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.61965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.61980: stderr chunk (state=3): >>>debug2: match not found <<< 19665 
1727204186.62005: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.62025: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.62038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204186.62062: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204186.62078: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.62095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.62124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.62146: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.62159: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204186.62176: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.62272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.62294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.62310: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.62391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.64111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204186.64214: stderr chunk (state=3): >>><<< 19665 1727204186.64224: stdout chunk (state=3): >>><<< 19665 1727204186.64337: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204186.64348: _low_level_execute_command(): starting 19665 1727204186.64350: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/AnsiballZ_network_connections.py && sleep 0' 19665 1727204186.65337: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.65343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 19665 1727204186.65378: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204186.65382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.65384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.65452: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.65469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.65551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.89285: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_r158luct/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_r158luct/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/fe68f071-1086-45ef-92de-86b998c54595: error=unknown <<< 19665 1727204186.89429: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 19665 1727204186.90924: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204186.90983: stderr chunk (state=3): >>><<< 19665 1727204186.90987: stdout chunk (state=3): >>><<< 19665 1727204186.91001: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_r158luct/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_r158luct/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on LSR-TST-br31/fe68f071-1086-45ef-92de-86b998c54595: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "LSR-TST-br31", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204186.91035: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'LSR-TST-br31', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204186.91044: _low_level_execute_command(): starting 19665 1727204186.91049: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204186.5154603-22655-153304612831214/ > /dev/null 2>&1 && sleep 0' 19665 1727204186.91743: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204186.91759: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.91778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.91800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.91842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.91854: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204186.91870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.91897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204186.91910: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204186.91921: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204186.91932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204186.91946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204186.91962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204186.91979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204186.91992: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204186.92019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204186.92137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204186.92170: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204186.92185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204186.92270: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204186.94041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204186.94104: stderr chunk (state=3): 
>>><<< 19665 1727204186.94108: stdout chunk (state=3): >>><<< 19665 1727204186.94123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204186.94128: handler run complete 19665 1727204186.94151: attempt loop complete, returning result 19665 1727204186.94154: _execute() done 19665 1727204186.94158: dumping result to json 19665 1727204186.94160: done dumping result, returning 19665 1727204186.94172: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [0affcd87-79f5-0dcc-3ea6-00000000006a] 19665 1727204186.94177: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006a 19665 1727204186.94334: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006a 19665 1727204186.94337: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 19665 1727204186.94427: no more pending results, returning what we have 19665 1727204186.94432: results queue empty 19665 1727204186.94433: checking for any_errors_fatal 19665 1727204186.94440: done checking for any_errors_fatal 19665 1727204186.94441: checking for max_fail_percentage 19665 1727204186.94443: done checking for max_fail_percentage 19665 1727204186.94443: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.94444: done checking to see if all hosts have failed 19665 1727204186.94445: getting the remaining hosts for this loop 19665 1727204186.94447: done getting the remaining hosts for this loop 19665 1727204186.94451: getting the next task for host managed-node3 19665 1727204186.94457: done getting next task for host managed-node3 19665 1727204186.94461: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 19665 1727204186.94462: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204186.94474: getting variables 19665 1727204186.94475: in VariableManager get_vars() 19665 1727204186.94508: Calling all_inventory to load vars for managed-node3 19665 1727204186.94511: Calling groups_inventory to load vars for managed-node3 19665 1727204186.94513: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.94521: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.94524: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.94526: Calling groups_plugins_play to load vars for managed-node3 19665 1727204186.96045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204186.97389: done with get_vars() 19665 1727204186.97409: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:56:26 -0400 (0:00:00.642) 0:00:37.841 ***** 19665 1727204186.97477: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204186.97719: worker is 1 (out of 1 available) 19665 1727204186.97734: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 19665 1727204186.97746: done queuing things up, now waiting for results queue to drain 19665 1727204186.97748: waiting for pending results... 19665 1727204186.97943: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 19665 1727204186.98022: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006b 19665 1727204186.98032: variable 'ansible_search_path' from source: unknown 19665 1727204186.98036: variable 'ansible_search_path' from source: unknown 19665 1727204186.98069: calling self._execute() 19665 1727204186.98150: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204186.98157: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204186.98169: variable 'omit' from source: magic vars 19665 1727204186.98540: variable 'ansible_distribution_major_version' from source: facts 19665 1727204186.98545: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204186.98891: variable 'network_state' from source: role '' defaults 19665 1727204186.98894: Evaluated conditional (network_state != {}): False 19665 1727204186.98897: when evaluation is False, skipping this task 19665 1727204186.98899: _execute() done 19665 1727204186.98901: dumping result to json 19665 1727204186.98903: done dumping result, returning 19665 1727204186.98905: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [0affcd87-79f5-0dcc-3ea6-00000000006b] 19665 1727204186.98907: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006b 19665 1727204186.99000: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006b 19665 1727204186.99003: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 19665 1727204186.99046: no more pending results, returning what we have 19665 1727204186.99049: results queue empty 19665 1727204186.99050: checking for any_errors_fatal 19665 1727204186.99059: done checking for any_errors_fatal 19665 1727204186.99059: checking for max_fail_percentage 19665 
1727204186.99061: done checking for max_fail_percentage 19665 1727204186.99062: checking to see if all hosts have failed and the running result is not ok 19665 1727204186.99062: done checking to see if all hosts have failed 19665 1727204186.99063: getting the remaining hosts for this loop 19665 1727204186.99066: done getting the remaining hosts for this loop 19665 1727204186.99070: getting the next task for host managed-node3 19665 1727204186.99075: done getting next task for host managed-node3 19665 1727204186.99079: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204186.99081: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204186.99095: getting variables 19665 1727204186.99096: in VariableManager get_vars() 19665 1727204186.99131: Calling all_inventory to load vars for managed-node3 19665 1727204186.99134: Calling groups_inventory to load vars for managed-node3 19665 1727204186.99136: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204186.99147: Calling all_plugins_play to load vars for managed-node3 19665 1727204186.99150: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204186.99152: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.00554: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.02056: done with get_vars() 19665 1727204187.02083: done getting variables 19665 1727204187.02148: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.047) 0:00:37.888 ***** 19665 1727204187.02184: entering _queue_task() for managed-node3/debug 19665 1727204187.02519: worker is 1 (out of 1 available) 19665 1727204187.02532: exiting _queue_task() for managed-node3/debug 19665 1727204187.02546: done queuing things up, now waiting for results queue to drain 19665 1727204187.02548: waiting for pending results... 
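The ok result that follows, with "__network_connections_result.stderr_lines": [""], is the output shape produced by a debug task that prints that variable. A minimal sketch of such a task, assuming the form used at roles/network/tasks/main.yml:177 (the role's actual task text is not reproduced in this log):

    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines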
19665 1727204187.02771: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 19665 1727204187.02866: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006c 19665 1727204187.02876: variable 'ansible_search_path' from source: unknown 19665 1727204187.02879: variable 'ansible_search_path' from source: unknown 19665 1727204187.02910: calling self._execute() 19665 1727204187.02993: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.02997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.03006: variable 'omit' from source: magic vars 19665 1727204187.03301: variable 'ansible_distribution_major_version' from source: facts 19665 1727204187.03312: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204187.03317: variable 'omit' from source: magic vars 19665 1727204187.03353: variable 'omit' from source: magic vars 19665 1727204187.03379: variable 'omit' from source: magic vars 19665 1727204187.03411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204187.03437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204187.03461: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204187.03474: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.03484: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.03508: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204187.03511: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.03514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.03587: Set connection var ansible_connection to ssh 19665 1727204187.03593: Set connection var ansible_shell_type to sh 19665 1727204187.03599: Set connection var ansible_timeout to 10 19665 1727204187.03604: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204187.03610: Set connection var ansible_shell_executable to /bin/sh 19665 1727204187.03617: Set connection var ansible_pipelining to False 19665 1727204187.03634: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.03637: variable 'ansible_connection' from source: unknown 19665 1727204187.03640: variable 'ansible_module_compression' from source: unknown 19665 1727204187.03645: variable 'ansible_shell_type' from source: unknown 19665 1727204187.03649: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.03651: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.03654: variable 'ansible_pipelining' from source: unknown 19665 1727204187.03656: variable 'ansible_timeout' from source: unknown 19665 1727204187.03662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.03760: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 
1727204187.03772: variable 'omit' from source: magic vars 19665 1727204187.03777: starting attempt loop 19665 1727204187.03779: running the handler 19665 1727204187.03871: variable '__network_connections_result' from source: set_fact 19665 1727204187.03911: handler run complete 19665 1727204187.03923: attempt loop complete, returning result 19665 1727204187.03926: _execute() done 19665 1727204187.03929: dumping result to json 19665 1727204187.03931: done dumping result, returning 19665 1727204187.03940: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [0affcd87-79f5-0dcc-3ea6-00000000006c] 19665 1727204187.03947: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006c 19665 1727204187.04037: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006c 19665 1727204187.04039: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 19665 1727204187.04096: no more pending results, returning what we have 19665 1727204187.04099: results queue empty 19665 1727204187.04100: checking for any_errors_fatal 19665 1727204187.04113: done checking for any_errors_fatal 19665 1727204187.04114: checking for max_fail_percentage 19665 1727204187.04116: done checking for max_fail_percentage 19665 1727204187.04117: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.04118: done checking to see if all hosts have failed 19665 1727204187.04118: getting the remaining hosts for this loop 19665 1727204187.04120: done getting the remaining hosts for this loop 19665 1727204187.04124: getting the next task for host managed-node3 19665 1727204187.04131: done getting next task for host managed-node3 19665 1727204187.04135: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204187.04136: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.04146: getting variables 19665 1727204187.04147: in VariableManager get_vars() 19665 1727204187.04214: Calling all_inventory to load vars for managed-node3 19665 1727204187.04223: Calling groups_inventory to load vars for managed-node3 19665 1727204187.04226: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.04236: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.04239: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.04241: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.05745: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.06659: done with get_vars() 19665 1727204187.06677: done getting variables 19665 1727204187.06722: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.045) 0:00:37.934 ***** 19665 1727204187.06747: entering _queue_task() for managed-node3/debug 19665 1727204187.06977: worker is 1 (out of 1 available) 19665 1727204187.06992: exiting _queue_task() for managed-node3/debug 19665 1727204187.07005: done queuing things up, now waiting for results queue to drain 19665 1727204187.07006: waiting for pending results... 19665 1727204187.07196: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 19665 1727204187.07275: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006d 19665 1727204187.07286: variable 'ansible_search_path' from source: unknown 19665 1727204187.07289: variable 'ansible_search_path' from source: unknown 19665 1727204187.07319: calling self._execute() 19665 1727204187.07396: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.07400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.07408: variable 'omit' from source: magic vars 19665 1727204187.07917: variable 'ansible_distribution_major_version' from source: facts 19665 1727204187.07966: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204187.07986: variable 'omit' from source: magic vars 19665 1727204187.08042: variable 'omit' from source: magic vars 19665 1727204187.08100: variable 'omit' from source: magic vars 19665 1727204187.08173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204187.08217: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204187.08234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204187.08248: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.08272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.08362: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204187.08388: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.08397: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.08476: Set connection var ansible_connection to ssh 19665 1727204187.08492: Set connection var ansible_shell_type to sh 19665 1727204187.08495: Set connection var ansible_timeout to 10 19665 1727204187.08497: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204187.08507: Set connection var ansible_shell_executable to /bin/sh 19665 1727204187.08518: Set connection var ansible_pipelining to False 19665 1727204187.08550: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.08560: variable 'ansible_connection' from source: unknown 19665 1727204187.08566: variable 'ansible_module_compression' from source: unknown 19665 1727204187.08568: variable 'ansible_shell_type' from source: unknown 19665 1727204187.08571: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.08573: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.08575: variable 'ansible_pipelining' from source: unknown 19665 1727204187.08582: variable 'ansible_timeout' from source: unknown 19665 1727204187.08585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.08690: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204187.08700: variable 'omit' from source: magic vars 19665 1727204187.08708: starting attempt loop 19665 1727204187.08711: running the handler 19665 1727204187.08747: variable '__network_connections_result' from source: set_fact 19665 1727204187.08803: variable '__network_connections_result' from source: set_fact 19665 1727204187.08894: handler run complete 19665 1727204187.08944: attempt loop complete, returning result 19665 1727204187.08947: _execute() done 19665 1727204187.08951: dumping result to json 19665 1727204187.08962: done dumping result, returning 19665 1727204187.08977: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [0affcd87-79f5-0dcc-3ea6-00000000006d] 19665 1727204187.08986: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006d 19665 1727204187.09119: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006d ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "LSR-TST-br31", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 19665 1727204187.09222: WORKER PROCESS EXITING 19665 1727204187.09241: no more pending results, returning what we have 19665 1727204187.09244: results queue empty 19665 1727204187.09245: checking for any_errors_fatal 19665 1727204187.09253: done checking for any_errors_fatal 19665 1727204187.09254: checking for max_fail_percentage 19665 1727204187.09255: done checking for max_fail_percentage 19665 1727204187.09256: checking to 
see if all hosts have failed and the running result is not ok 19665 1727204187.09257: done checking to see if all hosts have failed 19665 1727204187.09258: getting the remaining hosts for this loop 19665 1727204187.09259: done getting the remaining hosts for this loop 19665 1727204187.09265: getting the next task for host managed-node3 19665 1727204187.09271: done getting next task for host managed-node3 19665 1727204187.09275: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204187.09276: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204187.09291: getting variables 19665 1727204187.09293: in VariableManager get_vars() 19665 1727204187.09336: Calling all_inventory to load vars for managed-node3 19665 1727204187.09339: Calling groups_inventory to load vars for managed-node3 19665 1727204187.09341: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.09358: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.09362: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.09369: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.10921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.12199: done with get_vars() 19665 1727204187.12249: done getting variables 19665 1727204187.12307: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.055) 0:00:37.989 ***** 19665 1727204187.12337: entering _queue_task() for managed-node3/debug 19665 1727204187.12582: worker is 1 (out of 1 available) 19665 1727204187.12596: exiting _queue_task() for managed-node3/debug 19665 1727204187.12610: done queuing things up, now waiting for results queue to drain 19665 1727204187.12611: waiting for pending results... 
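The task that runs next is skipped with false_condition "network_state != {}": the role's network_state variable is still at its empty default ({}), so every network_state-guarded task in this run short-circuits before doing any work. A minimal sketch of that guard pattern (assumed task body; only the when expression is taken from the log):

    - name: Show debug messages for the network_state
      debug:
        var: network_state
      when: network_state != {}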
19665 1727204187.12804: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 19665 1727204187.12886: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006e 19665 1727204187.12897: variable 'ansible_search_path' from source: unknown 19665 1727204187.12901: variable 'ansible_search_path' from source: unknown 19665 1727204187.12930: calling self._execute() 19665 1727204187.13010: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.13014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.13023: variable 'omit' from source: magic vars 19665 1727204187.13330: variable 'ansible_distribution_major_version' from source: facts 19665 1727204187.13340: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204187.13430: variable 'network_state' from source: role '' defaults 19665 1727204187.13439: Evaluated conditional (network_state != {}): False 19665 1727204187.13442: when evaluation is False, skipping this task 19665 1727204187.13447: _execute() done 19665 1727204187.13450: dumping result to json 19665 1727204187.13452: done dumping result, returning 19665 1727204187.13461: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [0affcd87-79f5-0dcc-3ea6-00000000006e] 19665 1727204187.13473: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006e 19665 1727204187.13576: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006e 19665 1727204187.13579: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 19665 1727204187.13625: no more pending results, returning what we have 19665 1727204187.13630: results queue empty 19665 1727204187.13631: checking for any_errors_fatal 19665 1727204187.13639: done checking for any_errors_fatal 19665 1727204187.13640: checking for max_fail_percentage 19665 1727204187.13642: done checking for max_fail_percentage 19665 1727204187.13642: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.13643: done checking to see if all hosts have failed 19665 1727204187.13644: getting the remaining hosts for this loop 19665 1727204187.13646: done getting the remaining hosts for this loop 19665 1727204187.13650: getting the next task for host managed-node3 19665 1727204187.13655: done getting next task for host managed-node3 19665 1727204187.13659: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204187.13661: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.13682: getting variables 19665 1727204187.13684: in VariableManager get_vars() 19665 1727204187.13726: Calling all_inventory to load vars for managed-node3 19665 1727204187.13728: Calling groups_inventory to load vars for managed-node3 19665 1727204187.13730: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.13740: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.13743: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.13745: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.14950: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.16399: done with get_vars() 19665 1727204187.16424: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.041) 0:00:38.031 ***** 19665 1727204187.16507: entering _queue_task() for managed-node3/ping 19665 1727204187.16843: worker is 1 (out of 1 available) 19665 1727204187.16858: exiting _queue_task() for managed-node3/ping 19665 1727204187.16874: done queuing things up, now waiting for results queue to drain 19665 1727204187.16876: waiting for pending results... 19665 1727204187.17126: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 19665 1727204187.17209: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000006f 19665 1727204187.17220: variable 'ansible_search_path' from source: unknown 19665 1727204187.17223: variable 'ansible_search_path' from source: unknown 19665 1727204187.17257: calling self._execute() 19665 1727204187.17332: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.17336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.17349: variable 'omit' from source: magic vars 19665 1727204187.17657: variable 'ansible_distribution_major_version' from source: facts 19665 1727204187.17670: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204187.17675: variable 'omit' from source: magic vars 19665 1727204187.17705: variable 'omit' from source: magic vars 19665 1727204187.17732: variable 'omit' from source: magic vars 19665 1727204187.17768: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204187.17795: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204187.17812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204187.17828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.17837: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.17865: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204187.17868: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.17870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.17937: Set connection var ansible_connection to ssh 19665 1727204187.18021: Set connection var 
ansible_shell_type to sh 19665 1727204187.18024: Set connection var ansible_timeout to 10 19665 1727204187.18027: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204187.18029: Set connection var ansible_shell_executable to /bin/sh 19665 1727204187.18032: Set connection var ansible_pipelining to False 19665 1727204187.18034: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.18037: variable 'ansible_connection' from source: unknown 19665 1727204187.18041: variable 'ansible_module_compression' from source: unknown 19665 1727204187.18043: variable 'ansible_shell_type' from source: unknown 19665 1727204187.18046: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.18086: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.18089: variable 'ansible_pipelining' from source: unknown 19665 1727204187.18091: variable 'ansible_timeout' from source: unknown 19665 1727204187.18093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.18286: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204187.18293: variable 'omit' from source: magic vars 19665 1727204187.18300: starting attempt loop 19665 1727204187.18303: running the handler 19665 1727204187.18313: _low_level_execute_command(): starting 19665 1727204187.18319: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204187.18902: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.18952: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.18970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.19010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.19022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.19081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.20700: stdout chunk (state=3): >>>/root <<< 19665 1727204187.20798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.20870: stderr chunk (state=3): >>><<< 19665 1727204187.20898: stdout chunk (state=3): >>><<< 19665 1727204187.20916: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.20930: _low_level_execute_command(): starting 19665 1727204187.20936: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002 `" && echo ansible-tmp-1727204187.2091613-22687-250141419262002="` echo /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002 `" ) && sleep 0' 19665 1727204187.21604: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.21608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.21643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204187.21647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.21657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204187.21661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.21719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.21726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.21769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.23568: stdout chunk (state=3): >>>ansible-tmp-1727204187.2091613-22687-250141419262002=/root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002 <<< 19665 1727204187.23686: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.23793: stderr chunk (state=3): >>><<< 19665 1727204187.23797: stdout chunk (state=3): >>><<< 19665 1727204187.23814: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204187.2091613-22687-250141419262002=/root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.23874: variable 'ansible_module_compression' from source: unknown 19665 1727204187.23916: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 19665 1727204187.23963: variable 'ansible_facts' from source: unknown 19665 1727204187.24030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/AnsiballZ_ping.py 19665 1727204187.24147: Sending initial data 19665 1727204187.24150: Sent initial data (153 bytes) 19665 1727204187.24923: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.24950: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.24954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.24982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.25008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.25013: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.25022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.25050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.25057: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204187.25064: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.25069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.25104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204187.25113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.25225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.25253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.26906: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204187.26942: stderr chunk (state=3): >>>debug1: Using server download size 261120 <<< 19665 1727204187.26949: stderr chunk (state=3): >>>debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204187.26993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpko3n1yiu /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/AnsiballZ_ping.py <<< 19665 1727204187.27036: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204187.27965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.28106: stderr chunk (state=3): >>><<< 19665 1727204187.28109: stdout chunk (state=3): >>><<< 19665 1727204187.28126: done transferring module to remote 19665 1727204187.28137: _low_level_execute_command(): starting 19665 1727204187.28143: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/ /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/AnsiballZ_ping.py && sleep 0' 19665 1727204187.28713: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.28719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.28734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.28780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204187.28785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.28788: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204187.28791: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.28838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.28853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.28892: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.30612: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.30683: stderr chunk (state=3): >>><<< 19665 1727204187.30694: stdout chunk (state=3): >>><<< 19665 1727204187.30724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.30734: _low_level_execute_command(): starting 19665 1727204187.30742: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/AnsiballZ_ping.py && sleep 0' 19665 1727204187.31297: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.31304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.31342: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.31353: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.31361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.31388: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 19665 1727204187.31391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204187.31393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.31444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.31448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.31501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.44482: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": 
"pong"}}} <<< 19665 1727204187.45533: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204187.45538: stdout chunk (state=3): >>><<< 19665 1727204187.45540: stderr chunk (state=3): >>><<< 19665 1727204187.45670: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204187.45675: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204187.45678: _low_level_execute_command(): starting 19665 1727204187.45680: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.2091613-22687-250141419262002/ > /dev/null 2>&1 && sleep 0' 19665 1727204187.46251: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.46268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.46283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.46301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.46344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.46356: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.46374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.46391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.46403: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.15.87 is address <<< 19665 1727204187.46415: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.46430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.46444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.46461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.46477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.46489: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204187.46503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.46587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.46610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.46626: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.46698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.48538: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.48542: stdout chunk (state=3): >>><<< 19665 1727204187.48544: stderr chunk (state=3): >>><<< 19665 1727204187.48939: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.48943: handler run complete 19665 1727204187.48946: attempt loop complete, returning result 19665 1727204187.48948: _execute() done 19665 1727204187.48951: dumping result to json 19665 1727204187.48953: done dumping result, returning 19665 1727204187.48955: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [0affcd87-79f5-0dcc-3ea6-00000000006f] 19665 1727204187.48957: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006f 19665 1727204187.49024: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000006f 19665 1727204187.49028: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 19665 1727204187.49091: no more pending results, returning what we have 19665 1727204187.49094: results queue empty 19665 1727204187.49095: checking for any_errors_fatal 19665 1727204187.49101: done checking for 
any_errors_fatal 19665 1727204187.49101: checking for max_fail_percentage 19665 1727204187.49103: done checking for max_fail_percentage 19665 1727204187.49104: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.49105: done checking to see if all hosts have failed 19665 1727204187.49106: getting the remaining hosts for this loop 19665 1727204187.49107: done getting the remaining hosts for this loop 19665 1727204187.49111: getting the next task for host managed-node3 19665 1727204187.49118: done getting next task for host managed-node3 19665 1727204187.49121: ^ task is: TASK: meta (role_complete) 19665 1727204187.49122: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204187.49132: getting variables 19665 1727204187.49134: in VariableManager get_vars() 19665 1727204187.49182: Calling all_inventory to load vars for managed-node3 19665 1727204187.49185: Calling groups_inventory to load vars for managed-node3 19665 1727204187.49188: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.49197: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.49201: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.49204: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.51069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.52937: done with get_vars() 19665 1727204187.52961: done getting variables 19665 1727204187.53045: done queuing things up, now waiting for results queue to drain 19665 1727204187.53047: results queue empty 19665 1727204187.53048: checking for any_errors_fatal 19665 1727204187.53051: done checking for any_errors_fatal 19665 1727204187.53051: checking for max_fail_percentage 19665 1727204187.53052: done checking for max_fail_percentage 19665 1727204187.53053: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.53054: done checking to see if all hosts have failed 19665 1727204187.53055: getting the remaining hosts for this loop 19665 1727204187.53056: done getting the remaining hosts for this loop 19665 1727204187.53058: getting the next task for host managed-node3 19665 1727204187.53062: done getting next task for host managed-node3 19665 1727204187.53065: ^ task is: TASK: meta (flush_handlers) 19665 1727204187.53067: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.53070: getting variables 19665 1727204187.53071: in VariableManager get_vars() 19665 1727204187.53084: Calling all_inventory to load vars for managed-node3 19665 1727204187.53086: Calling groups_inventory to load vars for managed-node3 19665 1727204187.53088: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.53093: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.53095: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.53098: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.54454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.56931: done with get_vars() 19665 1727204187.56963: done getting variables 19665 1727204187.57019: in VariableManager get_vars() 19665 1727204187.57034: Calling all_inventory to load vars for managed-node3 19665 1727204187.57037: Calling groups_inventory to load vars for managed-node3 19665 1727204187.57039: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.57044: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.57047: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.57050: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.59101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.60724: done with get_vars() 19665 1727204187.60761: done queuing things up, now waiting for results queue to drain 19665 1727204187.60765: results queue empty 19665 1727204187.60766: checking for any_errors_fatal 19665 1727204187.60767: done checking for any_errors_fatal 19665 1727204187.60768: checking for max_fail_percentage 19665 1727204187.60769: done checking for max_fail_percentage 19665 1727204187.60770: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.60771: done checking to see if all hosts have failed 19665 1727204187.60772: getting the remaining hosts for this loop 19665 1727204187.60773: done getting the remaining hosts for this loop 19665 1727204187.60776: getting the next task for host managed-node3 19665 1727204187.60780: done getting next task for host managed-node3 19665 1727204187.60782: ^ task is: TASK: meta (flush_handlers) 19665 1727204187.60788: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.60791: getting variables 19665 1727204187.60792: in VariableManager get_vars() 19665 1727204187.60806: Calling all_inventory to load vars for managed-node3 19665 1727204187.60808: Calling groups_inventory to load vars for managed-node3 19665 1727204187.60810: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.60815: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.60819: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.60821: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.62036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.64428: done with get_vars() 19665 1727204187.64459: done getting variables 19665 1727204187.64518: in VariableManager get_vars() 19665 1727204187.64534: Calling all_inventory to load vars for managed-node3 19665 1727204187.64536: Calling groups_inventory to load vars for managed-node3 19665 1727204187.64538: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.64544: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.64546: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.64550: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.65861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.68437: done with get_vars() 19665 1727204187.68479: done queuing things up, now waiting for results queue to drain 19665 1727204187.68482: results queue empty 19665 1727204187.68483: checking for any_errors_fatal 19665 1727204187.68485: done checking for any_errors_fatal 19665 1727204187.68485: checking for max_fail_percentage 19665 1727204187.68487: done checking for max_fail_percentage 19665 1727204187.68488: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.68488: done checking to see if all hosts have failed 19665 1727204187.68489: getting the remaining hosts for this loop 19665 1727204187.68490: done getting the remaining hosts for this loop 19665 1727204187.68493: getting the next task for host managed-node3 19665 1727204187.68497: done getting next task for host managed-node3 19665 1727204187.68498: ^ task is: None 19665 1727204187.68500: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.68501: done queuing things up, now waiting for results queue to drain 19665 1727204187.68502: results queue empty 19665 1727204187.68503: checking for any_errors_fatal 19665 1727204187.68504: done checking for any_errors_fatal 19665 1727204187.68504: checking for max_fail_percentage 19665 1727204187.68505: done checking for max_fail_percentage 19665 1727204187.68506: checking to see if all hosts have failed and the running result is not ok 19665 1727204187.68507: done checking to see if all hosts have failed 19665 1727204187.68508: getting the next task for host managed-node3 19665 1727204187.68511: done getting next task for host managed-node3 19665 1727204187.68512: ^ task is: None 19665 1727204187.68513: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204187.68565: in VariableManager get_vars() 19665 1727204187.68585: done with get_vars() 19665 1727204187.68591: in VariableManager get_vars() 19665 1727204187.68601: done with get_vars() 19665 1727204187.68605: variable 'omit' from source: magic vars 19665 1727204187.68736: variable 'task' from source: play vars 19665 1727204187.68774: in VariableManager get_vars() 19665 1727204187.68786: done with get_vars() 19665 1727204187.68808: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_profile_absent.yml] ************************ 19665 1727204187.69003: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204187.69028: getting the remaining hosts for this loop 19665 1727204187.69029: done getting the remaining hosts for this loop 19665 1727204187.69032: getting the next task for host managed-node3 19665 1727204187.69034: done getting next task for host managed-node3 19665 1727204187.69036: ^ task is: TASK: Gathering Facts 19665 1727204187.69037: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204187.69039: getting variables 19665 1727204187.69040: in VariableManager get_vars() 19665 1727204187.69048: Calling all_inventory to load vars for managed-node3 19665 1727204187.69050: Calling groups_inventory to load vars for managed-node3 19665 1727204187.69052: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204187.69058: Calling all_plugins_play to load vars for managed-node3 19665 1727204187.69060: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204187.69063: Calling groups_plugins_play to load vars for managed-node3 19665 1727204187.70393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204187.72061: done with get_vars() 19665 1727204187.72088: done getting variables 19665 1727204187.72137: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:27 -0400 (0:00:00.556) 0:00:38.588 ***** 19665 1727204187.72166: entering _queue_task() for managed-node3/gather_facts 19665 1727204187.72796: worker is 1 (out of 1 available) 19665 1727204187.72810: exiting _queue_task() for managed-node3/gather_facts 19665 1727204187.72821: done queuing things up, now waiting for results queue to drain 19665 1727204187.72822: waiting for pending results... 
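[Editorial note, not part of the captured log: the records above show the next play, "Run the tasklist tasks/assert_profile_absent.yml", starting and its implicit "Gathering Facts" task being queued for managed-node3. The actual run_tasks.yml is not reproduced in this log, so the sketch below is only a hypothetical, minimal play of the same shape; the hosts pattern and the include step are assumptions, while the play-level variable named `task` is suggested by the "variable 'task' from source: play vars" entries above.]

    # Illustrative sketch only -- not the actual run_tasks.yml from the collection.
    - name: Run the tasklist {{ task }}
      hosts: managed-node3          # assumption; the real play may target a group
      gather_facts: true            # produces the implicit "Gathering Facts" task queued below
      tasks:
        - name: Include the requested tasklist
          ansible.builtin.include_tasks: "{{ task }}"

[A play like this would explain the sequence that follows: fact gathering first, then the included tasklist's tasks for managed-node3.]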
19665 1727204187.73526: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204187.73656: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000046e 19665 1727204187.73684: variable 'ansible_search_path' from source: unknown 19665 1727204187.73730: calling self._execute() 19665 1727204187.73840: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.73853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.73872: variable 'omit' from source: magic vars 19665 1727204187.74268: variable 'ansible_distribution_major_version' from source: facts 19665 1727204187.74289: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204187.74306: variable 'omit' from source: magic vars 19665 1727204187.74342: variable 'omit' from source: magic vars 19665 1727204187.74386: variable 'omit' from source: magic vars 19665 1727204187.74444: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204187.74491: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204187.74528: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204187.74556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.74577: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204187.74614: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204187.74629: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.74636: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.74739: Set connection var ansible_connection to ssh 19665 1727204187.74751: Set connection var ansible_shell_type to sh 19665 1727204187.74761: Set connection var ansible_timeout to 10 19665 1727204187.74773: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204187.74784: Set connection var ansible_shell_executable to /bin/sh 19665 1727204187.74795: Set connection var ansible_pipelining to False 19665 1727204187.74822: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.74832: variable 'ansible_connection' from source: unknown 19665 1727204187.74843: variable 'ansible_module_compression' from source: unknown 19665 1727204187.74849: variable 'ansible_shell_type' from source: unknown 19665 1727204187.74855: variable 'ansible_shell_executable' from source: unknown 19665 1727204187.74861: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204187.74869: variable 'ansible_pipelining' from source: unknown 19665 1727204187.74875: variable 'ansible_timeout' from source: unknown 19665 1727204187.74882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204187.75065: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204187.75085: variable 'omit' from source: magic vars 19665 1727204187.75094: starting attempt loop 19665 1727204187.75100: running the 
handler 19665 1727204187.75129: variable 'ansible_facts' from source: unknown 19665 1727204187.75158: _low_level_execute_command(): starting 19665 1727204187.75178: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204187.75979: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.75994: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.76011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.76034: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.76082: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.76094: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.76111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.76133: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.76149: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204187.76159: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.76173: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.76186: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.76201: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.76212: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.76224: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204187.76241: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.76325: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.76349: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.76376: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.76586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.78057: stdout chunk (state=3): >>>/root <<< 19665 1727204187.78262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.78268: stdout chunk (state=3): >>><<< 19665 1727204187.78271: stderr chunk (state=3): >>><<< 19665 1727204187.78399: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.78403: _low_level_execute_command(): starting 19665 1727204187.78406: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659 `" && echo ansible-tmp-1727204187.782953-22713-218801098188659="` echo /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659 `" ) && sleep 0' 19665 1727204187.79047: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.79071: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.79087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.79105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.79152: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.79174: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.79190: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.79209: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.79222: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204187.79234: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.79251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.79267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.79290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.79303: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.79315: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204187.79328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.79637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.79665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.79689: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.79769: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.81583: stdout chunk (state=3): >>>ansible-tmp-1727204187.782953-22713-218801098188659=/root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659 <<< 19665 1727204187.81793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.81797: stdout chunk (state=3): >>><<< 19665 1727204187.81799: stderr chunk (state=3): >>><<< 19665 1727204187.81971: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204187.782953-22713-218801098188659=/root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.81974: variable 'ansible_module_compression' from source: unknown 19665 1727204187.81977: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204187.82175: variable 'ansible_facts' from source: unknown 19665 1727204187.82178: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/AnsiballZ_setup.py 19665 1727204187.82339: Sending initial data 19665 1727204187.82345: Sent initial data (153 bytes) 19665 1727204187.83335: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.83354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.83373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.83392: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.83436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.83453: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.83470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.83489: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.83501: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204187.83513: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.83526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.83543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.83561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.83579: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.83592: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204187.83607: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.83686: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.83703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.83718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.83802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.85492: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204187.85531: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204187.85594: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpi73lqrj0 /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/AnsiballZ_setup.py <<< 19665 1727204187.85612: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204187.88027: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.88248: stderr chunk (state=3): >>><<< 19665 1727204187.88252: stdout chunk (state=3): >>><<< 19665 1727204187.88254: done transferring module to remote 19665 1727204187.88256: _low_level_execute_command(): starting 19665 1727204187.88265: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/ /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/AnsiballZ_setup.py && sleep 0' 19665 1727204187.88867: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204187.88884: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.88898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.88916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.88965: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.88979: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204187.88993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.89010: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204187.89021: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204187.89031: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204187.89045: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204187.89058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.89076: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.89089: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204187.89102: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204187.89116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.89196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204187.89212: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.89227: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.89309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204187.90982: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204187.91052: stderr chunk (state=3): >>><<< 19665 1727204187.91054: stdout chunk (state=3): >>><<< 19665 1727204187.91070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204187.91073: _low_level_execute_command(): starting 19665 1727204187.91078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/AnsiballZ_setup.py && sleep 0' 19665 1727204187.91542: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204187.91583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.91587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204187.91590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204187.91637: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204187.91656: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204187.91704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204188.41787: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_n<<< 19665 1727204188.41797: stdout chunk (state=3): >>>odename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, 
"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3262, "used": 270}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 534, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-a<<< 19665 1727204188.41827: stdout chunk (state=3): >>>ab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.37, "5m": 0.34, "15m": 0.17}, "ansible_dns": {"search": 
["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "28", "epoch": "1727204188", "epoch_int": "1727204188", "date": "2024-09-24", "time": "14:56:28", "iso8601_micro": "2024-09-24T18:56:28.377390Z", "iso8601": "2024-09-24T18:56:28Z", "iso8601_basic": "20240924T145628377390", "iso8601_basic_short": "20240924T145628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": 
"off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": 
"off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204188.43489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204188.43582: stderr chunk (state=3): >>><<< 19665 1727204188.43586: stdout chunk (state=3): >>><<< 19665 1727204188.43798: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", 
"ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2803, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 729, "free": 2803}, "nocache": {"free": 3262, "used": 270}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 534, "ansible_lvm": "N/A", 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_loadavg": {"1m": 0.37, "5m": 0.34, "15m": 0.17}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fips": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "28", "epoch": "1727204188", "epoch_int": "1727204188", "date": "2024-09-24", "time": "14:56:28", "iso8601_micro": "2024-09-24T18:56:28.377390Z", "iso8601": "2024-09-24T18:56:28Z", "iso8601_basic": "20240924T145628377390", "iso8601_basic_short": "20240924T145628", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", 
"tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off 
[fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204188.44103: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204188.44130: _low_level_execute_command(): starting 19665 1727204188.44143: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204187.782953-22713-218801098188659/ > /dev/null 2>&1 && sleep 0' 19665 1727204188.46261: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204188.46503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204188.46520: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204188.46544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204188.46588: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204188.46606: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204188.46621: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204188.46642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204188.46725: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204188.46736: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204188.46751: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204188.46767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204188.46784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204188.46795: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204188.46806: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204188.46820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204188.46901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204188.47171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204188.47189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204188.47383: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204188.49242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204188.49246: stdout chunk (state=3): >>><<< 19665 1727204188.49248: stderr chunk (state=3): >>><<< 19665 1727204188.49562: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204188.49568: handler run complete 19665 1727204188.49571: variable 'ansible_facts' from source: unknown 19665 1727204188.49573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.49899: variable 'ansible_facts' from source: unknown 19665 1727204188.50000: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.50136: attempt loop complete, returning result 19665 1727204188.50149: _execute() done 19665 1727204188.50157: dumping result to json 19665 1727204188.50193: done dumping result, returning 19665 1727204188.50228: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-00000000046e] 19665 1727204188.50274: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000046e ok: [managed-node3] 19665 1727204188.50885: no more pending results, returning what we have 19665 1727204188.50889: results queue empty 19665 1727204188.50890: checking for any_errors_fatal 19665 1727204188.50892: done checking for any_errors_fatal 19665 1727204188.50892: checking for max_fail_percentage 19665 1727204188.50895: done checking for max_fail_percentage 19665 1727204188.50896: checking to see if all hosts have failed and the running result is not ok 19665 1727204188.50897: done checking to see if all hosts have failed 19665 1727204188.50897: getting the remaining hosts for this loop 19665 1727204188.50899: done getting the remaining hosts for this loop 19665 1727204188.50903: getting the next task for host managed-node3 19665 1727204188.50910: done getting next task for host managed-node3 19665 1727204188.50912: ^ task is: TASK: meta (flush_handlers) 19665 1727204188.50914: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204188.50918: getting variables 19665 1727204188.50920: in VariableManager get_vars() 19665 1727204188.50948: Calling all_inventory to load vars for managed-node3 19665 1727204188.50951: Calling groups_inventory to load vars for managed-node3 19665 1727204188.50955: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.50968: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.50972: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.50976: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.52134: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000046e 19665 1727204188.52138: WORKER PROCESS EXITING 19665 1727204188.53769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.57259: done with get_vars() 19665 1727204188.57299: done getting variables 19665 1727204188.57686: in VariableManager get_vars() 19665 1727204188.57700: Calling all_inventory to load vars for managed-node3 19665 1727204188.57702: Calling groups_inventory to load vars for managed-node3 19665 1727204188.57705: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.57710: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.57713: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.57721: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.60339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.63523: done with get_vars() 19665 1727204188.63561: done queuing things up, now waiting for results queue to drain 19665 1727204188.63565: results queue empty 19665 1727204188.63566: checking for any_errors_fatal 19665 1727204188.63570: done checking for any_errors_fatal 19665 1727204188.63571: checking for max_fail_percentage 19665 1727204188.63572: done checking for max_fail_percentage 19665 1727204188.63573: checking to see if all hosts have failed and the running result is not ok 19665 1727204188.63574: done checking to see if all hosts have failed 19665 1727204188.63574: getting the remaining hosts for this loop 19665 1727204188.63575: done getting the remaining hosts for this loop 19665 1727204188.63578: getting the next task for host managed-node3 19665 1727204188.63583: done getting next task for host managed-node3 19665 1727204188.63585: ^ task is: TASK: Include the task '{{ task }}' 19665 1727204188.63587: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204188.63589: getting variables 19665 1727204188.63590: in VariableManager get_vars() 19665 1727204188.63599: Calling all_inventory to load vars for managed-node3 19665 1727204188.63601: Calling groups_inventory to load vars for managed-node3 19665 1727204188.63604: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.63609: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.63612: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.63614: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.66378: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.70082: done with get_vars() 19665 1727204188.70106: done getting variables 19665 1727204188.70262: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_profile_absent.yml'] ********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.981) 0:00:39.569 ***** 19665 1727204188.70293: entering _queue_task() for managed-node3/include_tasks 19665 1727204188.71226: worker is 1 (out of 1 available) 19665 1727204188.71239: exiting _queue_task() for managed-node3/include_tasks 19665 1727204188.71254: done queuing things up, now waiting for results queue to drain 19665 1727204188.71256: waiting for pending results... 19665 1727204188.72219: running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_profile_absent.yml' 19665 1727204188.72513: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000073 19665 1727204188.72601: variable 'ansible_search_path' from source: unknown 19665 1727204188.72647: calling self._execute() 19665 1727204188.72781: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204188.72797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204188.72817: variable 'omit' from source: magic vars 19665 1727204188.73254: variable 'ansible_distribution_major_version' from source: facts 19665 1727204188.73277: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204188.73292: variable 'task' from source: play vars 19665 1727204188.73362: variable 'task' from source: play vars 19665 1727204188.73378: _execute() done 19665 1727204188.73388: dumping result to json 19665 1727204188.73398: done dumping result, returning 19665 1727204188.73410: done running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_profile_absent.yml' [0affcd87-79f5-0dcc-3ea6-000000000073] 19665 1727204188.73422: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000073 19665 1727204188.73572: no more pending results, returning what we have 19665 1727204188.73578: in VariableManager get_vars() 19665 1727204188.73611: Calling all_inventory to load vars for managed-node3 19665 1727204188.73614: Calling groups_inventory to load vars for managed-node3 19665 1727204188.73617: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.73633: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.73636: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.73642: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.74271: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000073 19665 1727204188.74275: WORKER PROCESS 
EXITING 19665 1727204188.75704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.77520: done with get_vars() 19665 1727204188.77544: variable 'ansible_search_path' from source: unknown 19665 1727204188.77558: we have included files to process 19665 1727204188.77559: generating all_blocks data 19665 1727204188.77560: done generating all_blocks data 19665 1727204188.77561: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19665 1727204188.77562: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19665 1727204188.77869: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 19665 1727204188.78027: in VariableManager get_vars() 19665 1727204188.78046: done with get_vars() 19665 1727204188.78155: done processing included file 19665 1727204188.78156: iterating over new_blocks loaded from include file 19665 1727204188.78158: in VariableManager get_vars() 19665 1727204188.78273: done with get_vars() 19665 1727204188.78274: filtering new block on tags 19665 1727204188.78292: done filtering new block on tags 19665 1727204188.78295: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node3 19665 1727204188.78299: extending task lists for all hosts with included blocks 19665 1727204188.78328: done extending task lists 19665 1727204188.78329: done processing included files 19665 1727204188.78330: results queue empty 19665 1727204188.78330: checking for any_errors_fatal 19665 1727204188.78332: done checking for any_errors_fatal 19665 1727204188.78333: checking for max_fail_percentage 19665 1727204188.78334: done checking for max_fail_percentage 19665 1727204188.78334: checking to see if all hosts have failed and the running result is not ok 19665 1727204188.78335: done checking to see if all hosts have failed 19665 1727204188.78336: getting the remaining hosts for this loop 19665 1727204188.78337: done getting the remaining hosts for this loop 19665 1727204188.78341: getting the next task for host managed-node3 19665 1727204188.78345: done getting next task for host managed-node3 19665 1727204188.78347: ^ task is: TASK: Include the task 'get_profile_stat.yml' 19665 1727204188.78350: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204188.78352: getting variables 19665 1727204188.78353: in VariableManager get_vars() 19665 1727204188.78360: Calling all_inventory to load vars for managed-node3 19665 1727204188.78362: Calling groups_inventory to load vars for managed-node3 19665 1727204188.78366: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.78371: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.78374: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.78376: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.80010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204188.81926: done with get_vars() 19665 1727204188.81954: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:56:28 -0400 (0:00:00.117) 0:00:39.687 ***** 19665 1727204188.82046: entering _queue_task() for managed-node3/include_tasks 19665 1727204188.82762: worker is 1 (out of 1 available) 19665 1727204188.82776: exiting _queue_task() for managed-node3/include_tasks 19665 1727204188.82790: done queuing things up, now waiting for results queue to drain 19665 1727204188.82792: waiting for pending results... 19665 1727204188.83090: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 19665 1727204188.83237: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000047f 19665 1727204188.83268: variable 'ansible_search_path' from source: unknown 19665 1727204188.83276: variable 'ansible_search_path' from source: unknown 19665 1727204188.83314: calling self._execute() 19665 1727204188.83420: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204188.83432: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204188.83454: variable 'omit' from source: magic vars 19665 1727204188.83838: variable 'ansible_distribution_major_version' from source: facts 19665 1727204188.83860: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204188.83873: _execute() done 19665 1727204188.83881: dumping result to json 19665 1727204188.83893: done dumping result, returning 19665 1727204188.83903: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [0affcd87-79f5-0dcc-3ea6-00000000047f] 19665 1727204188.83913: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000047f 19665 1727204188.84042: no more pending results, returning what we have 19665 1727204188.84047: in VariableManager get_vars() 19665 1727204188.84083: Calling all_inventory to load vars for managed-node3 19665 1727204188.84085: Calling groups_inventory to load vars for managed-node3 19665 1727204188.84089: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.84103: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.84107: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.84111: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.85384: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000047f 19665 1727204188.85387: WORKER PROCESS EXITING 19665 1727204188.85856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 19665 1727204188.87741: done with get_vars() 19665 1727204188.87763: variable 'ansible_search_path' from source: unknown 19665 1727204188.87766: variable 'ansible_search_path' from source: unknown 19665 1727204188.87775: variable 'task' from source: play vars 19665 1727204188.87894: variable 'task' from source: play vars 19665 1727204188.87955: we have included files to process 19665 1727204188.87956: generating all_blocks data 19665 1727204188.87958: done generating all_blocks data 19665 1727204188.87959: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204188.87961: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204188.87963: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 19665 1727204188.89580: done processing included file 19665 1727204188.89582: iterating over new_blocks loaded from include file 19665 1727204188.89583: in VariableManager get_vars() 19665 1727204188.89596: done with get_vars() 19665 1727204188.89598: filtering new block on tags 19665 1727204188.89621: done filtering new block on tags 19665 1727204188.89624: in VariableManager get_vars() 19665 1727204188.89634: done with get_vars() 19665 1727204188.89636: filtering new block on tags 19665 1727204188.89660: done filtering new block on tags 19665 1727204188.89663: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 19665 1727204188.89671: extending task lists for all hosts with included blocks 19665 1727204188.89774: done extending task lists 19665 1727204188.89775: done processing included files 19665 1727204188.89776: results queue empty 19665 1727204188.89777: checking for any_errors_fatal 19665 1727204188.89780: done checking for any_errors_fatal 19665 1727204188.89781: checking for max_fail_percentage 19665 1727204188.89782: done checking for max_fail_percentage 19665 1727204188.89783: checking to see if all hosts have failed and the running result is not ok 19665 1727204188.89784: done checking to see if all hosts have failed 19665 1727204188.89785: getting the remaining hosts for this loop 19665 1727204188.89786: done getting the remaining hosts for this loop 19665 1727204188.89788: getting the next task for host managed-node3 19665 1727204188.89793: done getting next task for host managed-node3 19665 1727204188.89795: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 19665 1727204188.89798: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204188.89800: getting variables 19665 1727204188.89801: in VariableManager get_vars() 19665 1727204188.89809: Calling all_inventory to load vars for managed-node3 19665 1727204188.89811: Calling groups_inventory to load vars for managed-node3 19665 1727204188.89813: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204188.89819: Calling all_plugins_play to load vars for managed-node3 19665 1727204188.89821: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204188.89824: Calling groups_plugins_play to load vars for managed-node3 19665 1727204188.97238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204189.01736: done with get_vars() 19665 1727204189.01773: done getting variables 19665 1727204189.01822: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.198) 0:00:39.885 ***** 19665 1727204189.01860: entering _queue_task() for managed-node3/set_fact 19665 1727204189.02222: worker is 1 (out of 1 available) 19665 1727204189.02235: exiting _queue_task() for managed-node3/set_fact 19665 1727204189.02253: done queuing things up, now waiting for results queue to drain 19665 1727204189.02255: waiting for pending results... 
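(The set_fact task queued here, from get_profile_stat.yml:3, initializes three bookkeeping flags. Judging from the ansible_facts reported in its result further below, it is roughly equivalent to the following sketch; this is a reconstruction from the logged result, not the literal contents of the file.)

  - name: Initialize NM profile exist and ansible_managed comment flag
    ansible.builtin.set_fact:
      lsr_net_profile_exists: false
      lsr_net_profile_ansible_managed: false
      lsr_net_profile_fingerprint: false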
19665 1727204189.02552: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 19665 1727204189.02698: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000048a 19665 1727204189.02727: variable 'ansible_search_path' from source: unknown 19665 1727204189.02735: variable 'ansible_search_path' from source: unknown 19665 1727204189.02780: calling self._execute() 19665 1727204189.03086: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.03259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.03405: variable 'omit' from source: magic vars 19665 1727204189.05023: variable 'ansible_distribution_major_version' from source: facts 19665 1727204189.05185: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204189.05197: variable 'omit' from source: magic vars 19665 1727204189.05425: variable 'omit' from source: magic vars 19665 1727204189.05495: variable 'omit' from source: magic vars 19665 1727204189.05571: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204189.05612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204189.05647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204189.05676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.05694: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.05733: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204189.05746: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.05755: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.05867: Set connection var ansible_connection to ssh 19665 1727204189.05883: Set connection var ansible_shell_type to sh 19665 1727204189.05894: Set connection var ansible_timeout to 10 19665 1727204189.05905: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204189.05917: Set connection var ansible_shell_executable to /bin/sh 19665 1727204189.05929: Set connection var ansible_pipelining to False 19665 1727204189.05966: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.05975: variable 'ansible_connection' from source: unknown 19665 1727204189.06005: variable 'ansible_module_compression' from source: unknown 19665 1727204189.06015: variable 'ansible_shell_type' from source: unknown 19665 1727204189.06023: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.06030: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.06038: variable 'ansible_pipelining' from source: unknown 19665 1727204189.06049: variable 'ansible_timeout' from source: unknown 19665 1727204189.06063: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.06215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204189.06232: variable 
'omit' from source: magic vars 19665 1727204189.06246: starting attempt loop 19665 1727204189.06255: running the handler 19665 1727204189.06281: handler run complete 19665 1727204189.06295: attempt loop complete, returning result 19665 1727204189.06303: _execute() done 19665 1727204189.06310: dumping result to json 19665 1727204189.06317: done dumping result, returning 19665 1727204189.06328: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [0affcd87-79f5-0dcc-3ea6-00000000048a] 19665 1727204189.06338: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048a ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 19665 1727204189.06598: no more pending results, returning what we have 19665 1727204189.06603: results queue empty 19665 1727204189.06606: checking for any_errors_fatal 19665 1727204189.06608: done checking for any_errors_fatal 19665 1727204189.06609: checking for max_fail_percentage 19665 1727204189.06613: done checking for max_fail_percentage 19665 1727204189.06615: checking to see if all hosts have failed and the running result is not ok 19665 1727204189.06617: done checking to see if all hosts have failed 19665 1727204189.06617: getting the remaining hosts for this loop 19665 1727204189.06619: done getting the remaining hosts for this loop 19665 1727204189.06624: getting the next task for host managed-node3 19665 1727204189.06632: done getting next task for host managed-node3 19665 1727204189.06635: ^ task is: TASK: Stat profile file 19665 1727204189.06639: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204189.06645: getting variables 19665 1727204189.06647: in VariableManager get_vars() 19665 1727204189.06685: Calling all_inventory to load vars for managed-node3 19665 1727204189.06691: Calling groups_inventory to load vars for managed-node3 19665 1727204189.06697: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204189.06715: Calling all_plugins_play to load vars for managed-node3 19665 1727204189.06721: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204189.06726: Calling groups_plugins_play to load vars for managed-node3 19665 1727204189.07596: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048a 19665 1727204189.07600: WORKER PROCESS EXITING 19665 1727204189.08300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204189.10266: done with get_vars() 19665 1727204189.10294: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.085) 0:00:39.970 ***** 19665 1727204189.10394: entering _queue_task() for managed-node3/stat 19665 1727204189.11099: worker is 1 (out of 1 available) 19665 1727204189.11120: exiting _queue_task() for managed-node3/stat 19665 1727204189.11141: done queuing things up, now waiting for results queue to drain 19665 1727204189.11146: waiting for pending results... 19665 1727204189.11473: running TaskExecutor() for managed-node3/TASK: Stat profile file 19665 1727204189.11639: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000048b 19665 1727204189.11663: variable 'ansible_search_path' from source: unknown 19665 1727204189.11681: variable 'ansible_search_path' from source: unknown 19665 1727204189.11729: calling self._execute() 19665 1727204189.11830: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.11844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.11857: variable 'omit' from source: magic vars 19665 1727204189.12256: variable 'ansible_distribution_major_version' from source: facts 19665 1727204189.12274: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204189.12283: variable 'omit' from source: magic vars 19665 1727204189.12336: variable 'omit' from source: magic vars 19665 1727204189.12442: variable 'profile' from source: play vars 19665 1727204189.12454: variable 'interface' from source: set_fact 19665 1727204189.12522: variable 'interface' from source: set_fact 19665 1727204189.12549: variable 'omit' from source: magic vars 19665 1727204189.12596: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204189.12634: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204189.12663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204189.12695: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.12712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.12749: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 19665 1727204189.12798: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.12806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.13018: Set connection var ansible_connection to ssh 19665 1727204189.13030: Set connection var ansible_shell_type to sh 19665 1727204189.13043: Set connection var ansible_timeout to 10 19665 1727204189.13124: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204189.13136: Set connection var ansible_shell_executable to /bin/sh 19665 1727204189.13152: Set connection var ansible_pipelining to False 19665 1727204189.13181: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.13189: variable 'ansible_connection' from source: unknown 19665 1727204189.13195: variable 'ansible_module_compression' from source: unknown 19665 1727204189.13201: variable 'ansible_shell_type' from source: unknown 19665 1727204189.13230: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.13237: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.13250: variable 'ansible_pipelining' from source: unknown 19665 1727204189.13342: variable 'ansible_timeout' from source: unknown 19665 1727204189.13353: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.13808: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204189.13825: variable 'omit' from source: magic vars 19665 1727204189.13835: starting attempt loop 19665 1727204189.13846: running the handler 19665 1727204189.13863: _low_level_execute_command(): starting 19665 1727204189.13881: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204189.15150: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.15171: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.15187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.15211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.15256: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.15271: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.15286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.15307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.15320: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.15333: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.15348: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.15362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.15381: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.15395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 
<<< 19665 1727204189.15406: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.15423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.15497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.15514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.15532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.15757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.17329: stdout chunk (state=3): >>>/root <<< 19665 1727204189.17522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.17525: stdout chunk (state=3): >>><<< 19665 1727204189.17527: stderr chunk (state=3): >>><<< 19665 1727204189.17646: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.17650: _low_level_execute_command(): starting 19665 1727204189.17653: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167 `" && echo ansible-tmp-1727204189.175505-22880-105253027657167="` echo /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167 `" ) && sleep 0' 19665 1727204189.19182: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.19195: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.19209: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.19227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.19277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.19289: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.19303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.19319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.19329: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 
19665 1727204189.19339: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.19354: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.19367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.19384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.19395: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.19405: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.19416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.19553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.19605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.19620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.19818: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.21671: stdout chunk (state=3): >>>ansible-tmp-1727204189.175505-22880-105253027657167=/root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167 <<< 19665 1727204189.21862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.21869: stdout chunk (state=3): >>><<< 19665 1727204189.21872: stderr chunk (state=3): >>><<< 19665 1727204189.22172: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204189.175505-22880-105253027657167=/root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.22176: variable 'ansible_module_compression' from source: unknown 19665 1727204189.22178: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19665 1727204189.22180: variable 'ansible_facts' from source: unknown 19665 1727204189.22182: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/AnsiballZ_stat.py 19665 1727204189.22624: Sending initial data 19665 1727204189.22627: Sent initial data (152 bytes) 19665 1727204189.25625: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 
1727204189.25722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.25743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.25766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.25819: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.25832: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.25851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.25873: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.25886: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.25897: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.25909: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.25926: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.25948: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.25980: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.25993: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.26007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.26196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.26213: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.26228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.26379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.28122: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204189.28162: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204189.28179: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp4xeckn7v /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/AnsiballZ_stat.py <<< 19665 1727204189.28213: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204189.29581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.29788: stderr chunk (state=3): >>><<< 19665 1727204189.29792: stdout chunk (state=3): >>><<< 19665 1727204189.29795: done transferring module to remote 19665 1727204189.29798: _low_level_execute_command(): starting 
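(The AnsiballZ_stat.py payload just transferred implements the 'Stat profile file' task. Based on the module_args echoed in the result further below, the underlying task is approximately the sketch that follows; the path is shown with its resolved value, although in the run it is built from the 'profile'/'interface' play vars, and the register name is hypothetical.)

  - name: Stat profile file
    ansible.builtin.stat:
      path: /etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31
      get_attributes: false
      get_checksum: false
      get_mime: false
    register: profile_stat   # register name is an assumption, not visible in this log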
19665 1727204189.29805: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/ /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/AnsiballZ_stat.py && sleep 0' 19665 1727204189.31224: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.31282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.31298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.31316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.31476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.31492: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.31509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.31529: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.31543: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.31560: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.31576: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.31591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.31607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.31620: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.31630: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.31642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.31722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.31785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.31799: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.32013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.33681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.33752: stderr chunk (state=3): >>><<< 19665 1727204189.33755: stdout chunk (state=3): >>><<< 19665 1727204189.33847: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.33850: _low_level_execute_command(): starting 19665 1727204189.33853: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/AnsiballZ_stat.py && sleep 0' 19665 1727204189.35312: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.35328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.35344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.35366: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.35497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.35513: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.35527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.35545: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.35558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.35572: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.35584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.35598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.35619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.35633: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.35644: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.35658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.35851: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.35869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.35886: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.36068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.49108: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19665 1727204189.50189: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204189.50250: stderr chunk (state=3): >>><<< 19665 1727204189.50253: stdout chunk (state=3): >>><<< 19665 1727204189.50391: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204189.50403: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204189.50406: _low_level_execute_command(): starting 19665 1727204189.50409: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.175505-22880-105253027657167/ > /dev/null 2>&1 && sleep 0' 19665 1727204189.52501: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.52504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.52544: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204189.52548: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.52551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.52604: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.53175: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.53189: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.53262: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.55130: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.55133: stdout chunk (state=3): >>><<< 19665 1727204189.55136: stderr chunk (state=3): >>><<< 19665 1727204189.55469: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.55473: handler run complete 19665 1727204189.55475: attempt loop complete, returning result 19665 1727204189.55478: _execute() done 19665 1727204189.55480: dumping result to json 19665 1727204189.55482: done dumping result, returning 19665 1727204189.55484: done running TaskExecutor() for managed-node3/TASK: Stat profile file [0affcd87-79f5-0dcc-3ea6-00000000048b] 19665 1727204189.55486: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048b 19665 1727204189.55561: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048b 19665 1727204189.55567: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 19665 1727204189.55617: no more pending results, returning what we have 19665 1727204189.55622: results queue empty 19665 1727204189.55623: checking for any_errors_fatal 19665 1727204189.55628: done checking for any_errors_fatal 19665 1727204189.55629: checking for max_fail_percentage 19665 1727204189.55631: done checking for max_fail_percentage 19665 1727204189.55631: checking to see if all hosts have failed and the running result is not ok 19665 1727204189.55632: done checking to see if all hosts have failed 19665 1727204189.55633: getting the remaining hosts for this loop 19665 1727204189.55634: done getting the remaining hosts for this loop 19665 1727204189.55638: getting 
the next task for host managed-node3 19665 1727204189.55646: done getting next task for host managed-node3 19665 1727204189.55648: ^ task is: TASK: Set NM profile exist flag based on the profile files 19665 1727204189.55652: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204189.55655: getting variables 19665 1727204189.55657: in VariableManager get_vars() 19665 1727204189.55687: Calling all_inventory to load vars for managed-node3 19665 1727204189.55690: Calling groups_inventory to load vars for managed-node3 19665 1727204189.55694: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204189.55704: Calling all_plugins_play to load vars for managed-node3 19665 1727204189.55708: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204189.55711: Calling groups_plugins_play to load vars for managed-node3 19665 1727204189.58248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204189.60193: done with get_vars() 19665 1727204189.60226: done getting variables 19665 1727204189.60292: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.499) 0:00:40.469 ***** 19665 1727204189.60324: entering _queue_task() for managed-node3/set_fact 19665 1727204189.60678: worker is 1 (out of 1 available) 19665 1727204189.60691: exiting _queue_task() for managed-node3/set_fact 19665 1727204189.60705: done queuing things up, now waiting for results queue to drain 19665 1727204189.60706: waiting for pending results... 
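For orientation, the task whose result appears just above ("Stat profile file", reported ok with stat.exists == false) can be sketched as follows. The module arguments mirror the module_args echoed in the log; the {{ profile }} templating and the register name profile_stat are assumptions inferred from the task names in this section and from the profile_stat.stat.exists condition evaluated in the tasks that follow.

- name: Stat profile file                      # sketch reconstructed from the logged module_args
  ansible.builtin.stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"   # logged literally as ...ifcfg-LSR-TST-br31
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat                       # assumed name, implied by the later when: condition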
19665 1727204189.60995: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 19665 1727204189.61143: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000048c 19665 1727204189.61168: variable 'ansible_search_path' from source: unknown 19665 1727204189.61180: variable 'ansible_search_path' from source: unknown 19665 1727204189.61247: calling self._execute() 19665 1727204189.61553: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.61568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.61588: variable 'omit' from source: magic vars 19665 1727204189.62012: variable 'ansible_distribution_major_version' from source: facts 19665 1727204189.62035: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204189.62197: variable 'profile_stat' from source: set_fact 19665 1727204189.62215: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204189.62224: when evaluation is False, skipping this task 19665 1727204189.62237: _execute() done 19665 1727204189.62251: dumping result to json 19665 1727204189.62259: done dumping result, returning 19665 1727204189.62271: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [0affcd87-79f5-0dcc-3ea6-00000000048c] 19665 1727204189.62281: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048c skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204189.62432: no more pending results, returning what we have 19665 1727204189.62436: results queue empty 19665 1727204189.62437: checking for any_errors_fatal 19665 1727204189.62451: done checking for any_errors_fatal 19665 1727204189.62452: checking for max_fail_percentage 19665 1727204189.62454: done checking for max_fail_percentage 19665 1727204189.62455: checking to see if all hosts have failed and the running result is not ok 19665 1727204189.62456: done checking to see if all hosts have failed 19665 1727204189.62456: getting the remaining hosts for this loop 19665 1727204189.62458: done getting the remaining hosts for this loop 19665 1727204189.62463: getting the next task for host managed-node3 19665 1727204189.62473: done getting next task for host managed-node3 19665 1727204189.62477: ^ task is: TASK: Get NM profile info 19665 1727204189.62481: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204189.62485: getting variables 19665 1727204189.62487: in VariableManager get_vars() 19665 1727204189.62517: Calling all_inventory to load vars for managed-node3 19665 1727204189.62520: Calling groups_inventory to load vars for managed-node3 19665 1727204189.62524: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204189.62537: Calling all_plugins_play to load vars for managed-node3 19665 1727204189.62544: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204189.62547: Calling groups_plugins_play to load vars for managed-node3 19665 1727204189.63673: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048c 19665 1727204189.63678: WORKER PROCESS EXITING 19665 1727204189.64552: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204189.68362: done with get_vars() 19665 1727204189.68495: done getting variables 19665 1727204189.68554: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:56:29 -0400 (0:00:00.082) 0:00:40.553 ***** 19665 1727204189.68703: entering _queue_task() for managed-node3/shell 19665 1727204189.69380: worker is 1 (out of 1 available) 19665 1727204189.69395: exiting _queue_task() for managed-node3/shell 19665 1727204189.69408: done queuing things up, now waiting for results queue to drain 19665 1727204189.69410: waiting for pending results... 
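The set_fact task skipped above can be sketched like this; only the when condition is taken from the log (false_condition: profile_stat.stat.exists), while the fact name and value are hypothetical placeholders.

- name: Set NM profile exist flag based on the profile files
  ansible.builtin.set_fact:
    profile_file_exists: true                  # hypothetical fact name and value
  when: profile_stat.stat.exists               # reported as False in this run, hence the skip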
19665 1727204189.70298: running TaskExecutor() for managed-node3/TASK: Get NM profile info 19665 1727204189.70577: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000048d 19665 1727204189.70672: variable 'ansible_search_path' from source: unknown 19665 1727204189.70681: variable 'ansible_search_path' from source: unknown 19665 1727204189.70719: calling self._execute() 19665 1727204189.70856: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.70982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.71001: variable 'omit' from source: magic vars 19665 1727204189.71827: variable 'ansible_distribution_major_version' from source: facts 19665 1727204189.71873: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204189.71974: variable 'omit' from source: magic vars 19665 1727204189.72106: variable 'omit' from source: magic vars 19665 1727204189.72396: variable 'profile' from source: play vars 19665 1727204189.72405: variable 'interface' from source: set_fact 19665 1727204189.72477: variable 'interface' from source: set_fact 19665 1727204189.72622: variable 'omit' from source: magic vars 19665 1727204189.72671: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204189.72749: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204189.72844: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204189.72866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.72942: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204189.72980: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204189.72989: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.73042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.73260: Set connection var ansible_connection to ssh 19665 1727204189.73276: Set connection var ansible_shell_type to sh 19665 1727204189.73287: Set connection var ansible_timeout to 10 19665 1727204189.73296: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204189.73308: Set connection var ansible_shell_executable to /bin/sh 19665 1727204189.73320: Set connection var ansible_pipelining to False 19665 1727204189.73388: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.73475: variable 'ansible_connection' from source: unknown 19665 1727204189.73485: variable 'ansible_module_compression' from source: unknown 19665 1727204189.73491: variable 'ansible_shell_type' from source: unknown 19665 1727204189.73497: variable 'ansible_shell_executable' from source: unknown 19665 1727204189.73503: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204189.73510: variable 'ansible_pipelining' from source: unknown 19665 1727204189.73515: variable 'ansible_timeout' from source: unknown 19665 1727204189.73522: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204189.73795: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204189.73928: variable 'omit' from source: magic vars 19665 1727204189.73938: starting attempt loop 19665 1727204189.73949: running the handler 19665 1727204189.73963: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204189.73989: _low_level_execute_command(): starting 19665 1727204189.74000: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204189.76087: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.76223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.76243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.76268: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.76320: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.76332: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.76349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.76369: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.76381: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.76392: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.76402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.76414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.76437: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.76452: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.76463: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.76479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.76680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.76697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.76712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.76878: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.78471: stdout chunk (state=3): >>>/root <<< 19665 1727204189.78589: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.78694: stderr chunk (state=3): >>><<< 19665 1727204189.78697: stdout chunk (state=3): >>><<< 19665 1727204189.78830: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.78841: _low_level_execute_command(): starting 19665 1727204189.78845: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137 `" && echo ansible-tmp-1727204189.7872238-22946-142535689578137="` echo /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137 `" ) && sleep 0' 19665 1727204189.80333: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.80337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.80421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.80425: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.80427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.80598: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.80601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.80603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.80662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.82514: stdout chunk (state=3): >>>ansible-tmp-1727204189.7872238-22946-142535689578137=/root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137 <<< 19665 1727204189.82620: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.82709: stderr chunk (state=3): >>><<< 19665 1727204189.82713: stdout chunk (state=3): >>><<< 19665 1727204189.82973: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204189.7872238-22946-142535689578137=/root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.82977: variable 'ansible_module_compression' from source: unknown 19665 1727204189.82979: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19665 1727204189.82981: variable 'ansible_facts' from source: unknown 19665 1727204189.82983: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/AnsiballZ_command.py 19665 1727204189.83257: Sending initial data 19665 1727204189.83269: Sent initial data (156 bytes) 19665 1727204189.85394: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.85412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.85430: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.85470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.85636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.85648: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.85662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.85687: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.85699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.85708: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.85719: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.85731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.85745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.85757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.85807: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.85822: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.85935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.86028: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.86043: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.86244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.88002: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204189.88040: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204189.88078: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpoqpu32mv /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/AnsiballZ_command.py <<< 19665 1727204189.88119: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204189.89871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.90058: stderr chunk (state=3): >>><<< 19665 1727204189.90062: stdout chunk (state=3): >>><<< 19665 1727204189.90071: done transferring module to remote 19665 1727204189.90074: _low_level_execute_command(): starting 19665 1727204189.90076: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/ /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/AnsiballZ_command.py && sleep 0' 19665 1727204189.92362: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204189.92380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.92394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.92411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.92462: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.92562: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204189.92577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.92592: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204189.92602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204189.92611: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204189.92620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.92631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204189.92649: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.92668: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204189.92679: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204189.92691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.92769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.92896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.92912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.93111: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204189.94886: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204189.94957: stderr chunk (state=3): >>><<< 19665 1727204189.94960: stdout chunk (state=3): >>><<< 19665 1727204189.95067: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204189.95071: _low_level_execute_command(): starting 19665 1727204189.95074: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/AnsiballZ_command.py && sleep 0' 19665 1727204189.96157: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.96160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204189.96197: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204189.96200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204189.96202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204189.96204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204189.96257: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204189.96893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204189.96896: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204189.96953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.12320: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:30.105237", "end": "2024-09-24 14:56:30.122083", "delta": "0:00:00.016846", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204190.13619: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. <<< 19665 1727204190.13698: stderr chunk (state=3): >>><<< 19665 1727204190.13701: stdout chunk (state=3): >>><<< 19665 1727204190.13844: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "start": "2024-09-24 14:56:30.105237", "end": "2024-09-24 14:56:30.122083", "delta": "0:00:00.016846", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.15.87 closed. 
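The failed shell invocation above can be read back as a sketch of the "Get NM profile info" task. The pipeline is copied verbatim from the logged result; register: nm_profile_exists is an assumption inferred from the nm_profile_exists.rc == 0 check evaluated a few tasks later, and ignore_errors: true is inferred from the "...ignoring" note attached to the failed result below. Note that grep exits with status 1 when it selects no lines, so rc=1 here simply means no LSR-TST-br31 profile was found under /etc, not that nmcli itself failed.

- name: Get NM profile info                    # sketch; the literal profile name may be templated in the real task
  ansible.builtin.shell: nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc
  register: nm_profile_exists                  # assumed, implied by the later rc check
  ignore_errors: true                          # assumed, implied by "...ignoring" on the failed result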
19665 1727204190.13853: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204190.13856: _low_level_execute_command(): starting 19665 1727204190.13858: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204189.7872238-22946-142535689578137/ > /dev/null 2>&1 && sleep 0' 19665 1727204190.14481: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204190.14498: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204190.14513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204190.14540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.14585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204190.14598: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204190.14612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.14633: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204190.14646: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204190.14657: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204190.14674: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204190.14693: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204190.14708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.14721: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204190.14733: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204190.14750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.14825: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.14897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204190.14929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.15074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.16949: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204190.17131: stderr chunk (state=3): >>><<< 19665 1727204190.17148: stdout chunk (state=3): >>><<< 19665 1727204190.17248: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204190.17252: handler run complete 19665 1727204190.17329: Evaluated conditional (False): False 19665 1727204190.17384: attempt loop complete, returning result 19665 1727204190.17387: _execute() done 19665 1727204190.17389: dumping result to json 19665 1727204190.17397: done dumping result, returning 19665 1727204190.17422: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [0affcd87-79f5-0dcc-3ea6-00000000048d] 19665 1727204190.17426: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048d 19665 1727204190.17546: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048d 19665 1727204190.17550: WORKER PROCESS EXITING fatal: [managed-node3]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep LSR-TST-br31 | grep /etc", "delta": "0:00:00.016846", "end": "2024-09-24 14:56:30.122083", "rc": 1, "start": "2024-09-24 14:56:30.105237" } MSG: non-zero return code ...ignoring 19665 1727204190.17638: no more pending results, returning what we have 19665 1727204190.17645: results queue empty 19665 1727204190.17646: checking for any_errors_fatal 19665 1727204190.17654: done checking for any_errors_fatal 19665 1727204190.17655: checking for max_fail_percentage 19665 1727204190.17657: done checking for max_fail_percentage 19665 1727204190.17658: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.17659: done checking to see if all hosts have failed 19665 1727204190.17659: getting the remaining hosts for this loop 19665 1727204190.17661: done getting the remaining hosts for this loop 19665 1727204190.17669: getting the next task for host managed-node3 19665 1727204190.17680: done getting next task for host managed-node3 19665 1727204190.17683: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19665 1727204190.17686: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204190.17689: getting variables 19665 1727204190.17691: in VariableManager get_vars() 19665 1727204190.17728: Calling all_inventory to load vars for managed-node3 19665 1727204190.17731: Calling groups_inventory to load vars for managed-node3 19665 1727204190.17734: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.17752: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.17756: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.17760: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.20278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.22419: done with get_vars() 19665 1727204190.22458: done getting variables 19665 1727204190.22555: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.539) 0:00:41.092 ***** 19665 1727204190.22611: entering _queue_task() for managed-node3/set_fact 19665 1727204190.23054: worker is 1 (out of 1 available) 19665 1727204190.23074: exiting _queue_task() for managed-node3/set_fact 19665 1727204190.23090: done queuing things up, now waiting for results queue to drain 19665 1727204190.23094: waiting for pending results... 
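The task queued here, which is skipped immediately below because nm_profile_exists.rc == 0 evaluates to False, can be sketched as a conditional set_fact; the fact names and values are hypothetical, only the when condition comes from the log.

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    nm_profile_found: true                     # hypothetical fact names and values
    profile_ansible_managed: true
  when: nm_profile_exists.rc == 0              # evaluated False in this run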
19665 1727204190.23420: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 19665 1727204190.23604: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000048e 19665 1727204190.23627: variable 'ansible_search_path' from source: unknown 19665 1727204190.23634: variable 'ansible_search_path' from source: unknown 19665 1727204190.23719: calling self._execute() 19665 1727204190.23863: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.23879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.23895: variable 'omit' from source: magic vars 19665 1727204190.24375: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.24394: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.24575: variable 'nm_profile_exists' from source: set_fact 19665 1727204190.24594: Evaluated conditional (nm_profile_exists.rc == 0): False 19665 1727204190.24603: when evaluation is False, skipping this task 19665 1727204190.24610: _execute() done 19665 1727204190.24618: dumping result to json 19665 1727204190.24624: done dumping result, returning 19665 1727204190.24643: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [0affcd87-79f5-0dcc-3ea6-00000000048e] 19665 1727204190.24656: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048e skipping: [managed-node3] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 19665 1727204190.24822: no more pending results, returning what we have 19665 1727204190.24827: results queue empty 19665 1727204190.24828: checking for any_errors_fatal 19665 1727204190.24837: done checking for any_errors_fatal 19665 1727204190.24838: checking for max_fail_percentage 19665 1727204190.24843: done checking for max_fail_percentage 19665 1727204190.24844: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.24845: done checking to see if all hosts have failed 19665 1727204190.24845: getting the remaining hosts for this loop 19665 1727204190.24847: done getting the remaining hosts for this loop 19665 1727204190.24852: getting the next task for host managed-node3 19665 1727204190.24862: done getting next task for host managed-node3 19665 1727204190.24871: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 19665 1727204190.24875: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.24880: getting variables 19665 1727204190.24881: in VariableManager get_vars() 19665 1727204190.24914: Calling all_inventory to load vars for managed-node3 19665 1727204190.24916: Calling groups_inventory to load vars for managed-node3 19665 1727204190.24921: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.24935: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.24939: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.24945: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.25986: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000048e 19665 1727204190.25989: WORKER PROCESS EXITING 19665 1727204190.27303: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.30001: done with get_vars() 19665 1727204190.30032: done getting variables 19665 1727204190.30108: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204190.30279: variable 'profile' from source: play vars 19665 1727204190.30283: variable 'interface' from source: set_fact 19665 1727204190.30355: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-LSR-TST-br31] ******************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.077) 0:00:41.170 ***** 19665 1727204190.30392: entering _queue_task() for managed-node3/command 19665 1727204190.30743: worker is 1 (out of 1 available) 19665 1727204190.30761: exiting _queue_task() for managed-node3/command 19665 1727204190.30775: done queuing things up, now waiting for results queue to drain 19665 1727204190.30777: waiting for pending results... 
19665 1727204190.31073: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 19665 1727204190.31224: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000490 19665 1727204190.31249: variable 'ansible_search_path' from source: unknown 19665 1727204190.31257: variable 'ansible_search_path' from source: unknown 19665 1727204190.31299: calling self._execute() 19665 1727204190.31414: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.31433: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.31450: variable 'omit' from source: magic vars 19665 1727204190.31776: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.31787: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.31875: variable 'profile_stat' from source: set_fact 19665 1727204190.31885: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204190.31888: when evaluation is False, skipping this task 19665 1727204190.31891: _execute() done 19665 1727204190.31894: dumping result to json 19665 1727204190.31896: done dumping result, returning 19665 1727204190.31902: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000490] 19665 1727204190.31907: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000490 19665 1727204190.31996: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000490 19665 1727204190.31999: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204190.32044: no more pending results, returning what we have 19665 1727204190.32048: results queue empty 19665 1727204190.32048: checking for any_errors_fatal 19665 1727204190.32053: done checking for any_errors_fatal 19665 1727204190.32054: checking for max_fail_percentage 19665 1727204190.32055: done checking for max_fail_percentage 19665 1727204190.32056: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.32057: done checking to see if all hosts have failed 19665 1727204190.32058: getting the remaining hosts for this loop 19665 1727204190.32059: done getting the remaining hosts for this loop 19665 1727204190.32065: getting the next task for host managed-node3 19665 1727204190.32073: done getting next task for host managed-node3 19665 1727204190.32076: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 19665 1727204190.32079: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.32083: getting variables 19665 1727204190.32084: in VariableManager get_vars() 19665 1727204190.32112: Calling all_inventory to load vars for managed-node3 19665 1727204190.32114: Calling groups_inventory to load vars for managed-node3 19665 1727204190.32118: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.32129: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.32131: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.32134: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.32965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.34553: done with get_vars() 19665 1727204190.34572: done getting variables 19665 1727204190.34619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204190.34702: variable 'profile' from source: play vars 19665 1727204190.34704: variable 'interface' from source: set_fact 19665 1727204190.34748: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-LSR-TST-br31] **************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.043) 0:00:41.214 ***** 19665 1727204190.34774: entering _queue_task() for managed-node3/set_fact 19665 1727204190.34998: worker is 1 (out of 1 available) 19665 1727204190.35011: exiting _queue_task() for managed-node3/set_fact 19665 1727204190.35022: done queuing things up, now waiting for results queue to drain 19665 1727204190.35024: waiting for pending results... 
19665 1727204190.35200: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 19665 1727204190.35282: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000491 19665 1727204190.35295: variable 'ansible_search_path' from source: unknown 19665 1727204190.35298: variable 'ansible_search_path' from source: unknown 19665 1727204190.35326: calling self._execute() 19665 1727204190.35402: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.35405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.35413: variable 'omit' from source: magic vars 19665 1727204190.35683: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.35694: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.35779: variable 'profile_stat' from source: set_fact 19665 1727204190.35792: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204190.35795: when evaluation is False, skipping this task 19665 1727204190.35798: _execute() done 19665 1727204190.35806: dumping result to json 19665 1727204190.35809: done dumping result, returning 19665 1727204190.35814: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000491] 19665 1727204190.35820: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000491 19665 1727204190.35904: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000491 19665 1727204190.35908: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204190.35956: no more pending results, returning what we have 19665 1727204190.35960: results queue empty 19665 1727204190.35961: checking for any_errors_fatal 19665 1727204190.35970: done checking for any_errors_fatal 19665 1727204190.35970: checking for max_fail_percentage 19665 1727204190.35972: done checking for max_fail_percentage 19665 1727204190.35973: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.35973: done checking to see if all hosts have failed 19665 1727204190.35974: getting the remaining hosts for this loop 19665 1727204190.35976: done getting the remaining hosts for this loop 19665 1727204190.35980: getting the next task for host managed-node3 19665 1727204190.35987: done getting next task for host managed-node3 19665 1727204190.35989: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 19665 1727204190.35993: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.35996: getting variables 19665 1727204190.35998: in VariableManager get_vars() 19665 1727204190.36030: Calling all_inventory to load vars for managed-node3 19665 1727204190.36032: Calling groups_inventory to load vars for managed-node3 19665 1727204190.36035: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.36048: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.36050: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.36053: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.37598: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.39580: done with get_vars() 19665 1727204190.39623: done getting variables 19665 1727204190.39724: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204190.39918: variable 'profile' from source: play vars 19665 1727204190.39932: variable 'interface' from source: set_fact 19665 1727204190.39986: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-LSR-TST-br31] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.052) 0:00:41.266 ***** 19665 1727204190.40011: entering _queue_task() for managed-node3/command 19665 1727204190.40331: worker is 1 (out of 1 available) 19665 1727204190.40368: exiting _queue_task() for managed-node3/command 19665 1727204190.40380: done queuing things up, now waiting for results queue to drain 19665 1727204190.40382: waiting for pending results... 
19665 1727204190.40577: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 19665 1727204190.40667: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000492 19665 1727204190.40683: variable 'ansible_search_path' from source: unknown 19665 1727204190.40687: variable 'ansible_search_path' from source: unknown 19665 1727204190.40716: calling self._execute() 19665 1727204190.40814: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.40817: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.40829: variable 'omit' from source: magic vars 19665 1727204190.41160: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.41172: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.41256: variable 'profile_stat' from source: set_fact 19665 1727204190.41271: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204190.41274: when evaluation is False, skipping this task 19665 1727204190.41277: _execute() done 19665 1727204190.41279: dumping result to json 19665 1727204190.41282: done dumping result, returning 19665 1727204190.41285: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000492] 19665 1727204190.41292: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000492 19665 1727204190.41376: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000492 19665 1727204190.41379: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204190.41428: no more pending results, returning what we have 19665 1727204190.41432: results queue empty 19665 1727204190.41433: checking for any_errors_fatal 19665 1727204190.41441: done checking for any_errors_fatal 19665 1727204190.41442: checking for max_fail_percentage 19665 1727204190.41444: done checking for max_fail_percentage 19665 1727204190.41445: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.41445: done checking to see if all hosts have failed 19665 1727204190.41446: getting the remaining hosts for this loop 19665 1727204190.41448: done getting the remaining hosts for this loop 19665 1727204190.41452: getting the next task for host managed-node3 19665 1727204190.41459: done getting next task for host managed-node3 19665 1727204190.41461: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 19665 1727204190.41466: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.41470: getting variables 19665 1727204190.41471: in VariableManager get_vars() 19665 1727204190.41503: Calling all_inventory to load vars for managed-node3 19665 1727204190.41505: Calling groups_inventory to load vars for managed-node3 19665 1727204190.41508: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.41518: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.41520: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.41523: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.42342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.43784: done with get_vars() 19665 1727204190.43802: done getting variables 19665 1727204190.43848: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204190.44062: variable 'profile' from source: play vars 19665 1727204190.44068: variable 'interface' from source: set_fact 19665 1727204190.44123: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-LSR-TST-br31] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.041) 0:00:41.308 ***** 19665 1727204190.44188: entering _queue_task() for managed-node3/set_fact 19665 1727204190.44608: worker is 1 (out of 1 available) 19665 1727204190.44621: exiting _queue_task() for managed-node3/set_fact 19665 1727204190.44634: done queuing things up, now waiting for results queue to drain 19665 1727204190.44635: waiting for pending results... 
19665 1727204190.44870: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 19665 1727204190.45028: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000493 19665 1727204190.45040: variable 'ansible_search_path' from source: unknown 19665 1727204190.45043: variable 'ansible_search_path' from source: unknown 19665 1727204190.45076: calling self._execute() 19665 1727204190.45177: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.45184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.45235: variable 'omit' from source: magic vars 19665 1727204190.45603: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.45613: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.45713: variable 'profile_stat' from source: set_fact 19665 1727204190.45724: Evaluated conditional (profile_stat.stat.exists): False 19665 1727204190.45727: when evaluation is False, skipping this task 19665 1727204190.45730: _execute() done 19665 1727204190.45732: dumping result to json 19665 1727204190.45734: done dumping result, returning 19665 1727204190.45741: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-000000000493] 19665 1727204190.45749: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000493 19665 1727204190.45834: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000493 19665 1727204190.45837: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 19665 1727204190.45889: no more pending results, returning what we have 19665 1727204190.45892: results queue empty 19665 1727204190.45893: checking for any_errors_fatal 19665 1727204190.45901: done checking for any_errors_fatal 19665 1727204190.45901: checking for max_fail_percentage 19665 1727204190.45903: done checking for max_fail_percentage 19665 1727204190.45904: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.45905: done checking to see if all hosts have failed 19665 1727204190.45905: getting the remaining hosts for this loop 19665 1727204190.45907: done getting the remaining hosts for this loop 19665 1727204190.45911: getting the next task for host managed-node3 19665 1727204190.45920: done getting next task for host managed-node3 19665 1727204190.45923: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 19665 1727204190.45925: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.45929: getting variables 19665 1727204190.45931: in VariableManager get_vars() 19665 1727204190.45960: Calling all_inventory to load vars for managed-node3 19665 1727204190.45962: Calling groups_inventory to load vars for managed-node3 19665 1727204190.45966: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.45977: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.45979: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.45982: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.46992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.48082: done with get_vars() 19665 1727204190.48099: done getting variables 19665 1727204190.48194: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204190.48308: variable 'profile' from source: play vars 19665 1727204190.48311: variable 'interface' from source: set_fact 19665 1727204190.48351: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'LSR-TST-br31'] ********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.041) 0:00:41.350 ***** 19665 1727204190.48377: entering _queue_task() for managed-node3/assert 19665 1727204190.48598: worker is 1 (out of 1 available) 19665 1727204190.48611: exiting _queue_task() for managed-node3/assert 19665 1727204190.48631: done queuing things up, now waiting for results queue to drain 19665 1727204190.48633: waiting for pending results... 
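The assert task queued here (assert_profile_absent.yml:5) is the one that succeeds further down with "All assertions passed". Only the condition not lsr_net_profile_exists is confirmed by the log; the rest of this sketch, including the failure message, is an assumed reconstruction of what such an assertion typically looks like.

    # Assumed shape of the assertion in assert_profile_absent.yml; the `that`
    # condition is taken from the log, the fail_msg wording is illustrative.
    - name: Assert that the profile is absent - '{{ profile }}'
      assert:
        that:
          - not lsr_net_profile_exists
        fail_msg: "Profile {{ profile }} still exists"   # illustrative message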
19665 1727204190.48859: running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'LSR-TST-br31' 19665 1727204190.49075: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000480 19665 1727204190.49079: variable 'ansible_search_path' from source: unknown 19665 1727204190.49082: variable 'ansible_search_path' from source: unknown 19665 1727204190.49085: calling self._execute() 19665 1727204190.49178: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.49182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.49210: variable 'omit' from source: magic vars 19665 1727204190.49973: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.50031: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.50034: variable 'omit' from source: magic vars 19665 1727204190.50095: variable 'omit' from source: magic vars 19665 1727204190.50255: variable 'profile' from source: play vars 19665 1727204190.50259: variable 'interface' from source: set_fact 19665 1727204190.50352: variable 'interface' from source: set_fact 19665 1727204190.50388: variable 'omit' from source: magic vars 19665 1727204190.50425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204190.50510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204190.50526: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204190.50548: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204190.50559: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204190.50592: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204190.50596: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.50599: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.50711: Set connection var ansible_connection to ssh 19665 1727204190.50719: Set connection var ansible_shell_type to sh 19665 1727204190.50724: Set connection var ansible_timeout to 10 19665 1727204190.50730: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204190.50738: Set connection var ansible_shell_executable to /bin/sh 19665 1727204190.50750: Set connection var ansible_pipelining to False 19665 1727204190.50791: variable 'ansible_shell_executable' from source: unknown 19665 1727204190.50794: variable 'ansible_connection' from source: unknown 19665 1727204190.50797: variable 'ansible_module_compression' from source: unknown 19665 1727204190.50799: variable 'ansible_shell_type' from source: unknown 19665 1727204190.50803: variable 'ansible_shell_executable' from source: unknown 19665 1727204190.50805: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.50807: variable 'ansible_pipelining' from source: unknown 19665 1727204190.50831: variable 'ansible_timeout' from source: unknown 19665 1727204190.50834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.51115: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204190.51122: variable 'omit' from source: magic vars 19665 1727204190.51128: starting attempt loop 19665 1727204190.51131: running the handler 19665 1727204190.51396: variable 'lsr_net_profile_exists' from source: set_fact 19665 1727204190.51402: Evaluated conditional (not lsr_net_profile_exists): True 19665 1727204190.51421: handler run complete 19665 1727204190.51456: attempt loop complete, returning result 19665 1727204190.51459: _execute() done 19665 1727204190.51462: dumping result to json 19665 1727204190.51476: done dumping result, returning 19665 1727204190.51479: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is absent - 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-000000000480] 19665 1727204190.51523: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000480 19665 1727204190.51633: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000480 19665 1727204190.51636: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204190.51692: no more pending results, returning what we have 19665 1727204190.51695: results queue empty 19665 1727204190.51696: checking for any_errors_fatal 19665 1727204190.51703: done checking for any_errors_fatal 19665 1727204190.51704: checking for max_fail_percentage 19665 1727204190.51705: done checking for max_fail_percentage 19665 1727204190.51706: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.51707: done checking to see if all hosts have failed 19665 1727204190.51708: getting the remaining hosts for this loop 19665 1727204190.51709: done getting the remaining hosts for this loop 19665 1727204190.51713: getting the next task for host managed-node3 19665 1727204190.51722: done getting next task for host managed-node3 19665 1727204190.51724: ^ task is: TASK: meta (flush_handlers) 19665 1727204190.51728: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.51734: getting variables 19665 1727204190.51737: in VariableManager get_vars() 19665 1727204190.51772: Calling all_inventory to load vars for managed-node3 19665 1727204190.51774: Calling groups_inventory to load vars for managed-node3 19665 1727204190.51781: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.51791: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.51794: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.51797: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.54376: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.55709: done with get_vars() 19665 1727204190.55746: done getting variables 19665 1727204190.55804: in VariableManager get_vars() 19665 1727204190.55811: Calling all_inventory to load vars for managed-node3 19665 1727204190.55813: Calling groups_inventory to load vars for managed-node3 19665 1727204190.55815: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.55828: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.55831: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.55835: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.56713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.57742: done with get_vars() 19665 1727204190.57765: done queuing things up, now waiting for results queue to drain 19665 1727204190.57768: results queue empty 19665 1727204190.57768: checking for any_errors_fatal 19665 1727204190.57771: done checking for any_errors_fatal 19665 1727204190.57771: checking for max_fail_percentage 19665 1727204190.57772: done checking for max_fail_percentage 19665 1727204190.57773: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.57779: done checking to see if all hosts have failed 19665 1727204190.57779: getting the remaining hosts for this loop 19665 1727204190.57780: done getting the remaining hosts for this loop 19665 1727204190.57787: getting the next task for host managed-node3 19665 1727204190.57800: done getting next task for host managed-node3 19665 1727204190.57802: ^ task is: TASK: meta (flush_handlers) 19665 1727204190.57804: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.57807: getting variables 19665 1727204190.57808: in VariableManager get_vars() 19665 1727204190.57818: Calling all_inventory to load vars for managed-node3 19665 1727204190.57820: Calling groups_inventory to load vars for managed-node3 19665 1727204190.57821: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.57826: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.57827: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.57829: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.58790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.60084: done with get_vars() 19665 1727204190.60114: done getting variables 19665 1727204190.60174: in VariableManager get_vars() 19665 1727204190.60193: Calling all_inventory to load vars for managed-node3 19665 1727204190.60198: Calling groups_inventory to load vars for managed-node3 19665 1727204190.60201: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.60210: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.60214: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.60217: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.61147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.62153: done with get_vars() 19665 1727204190.62177: done queuing things up, now waiting for results queue to drain 19665 1727204190.62179: results queue empty 19665 1727204190.62179: checking for any_errors_fatal 19665 1727204190.62180: done checking for any_errors_fatal 19665 1727204190.62181: checking for max_fail_percentage 19665 1727204190.62181: done checking for max_fail_percentage 19665 1727204190.62182: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.62182: done checking to see if all hosts have failed 19665 1727204190.62183: getting the remaining hosts for this loop 19665 1727204190.62184: done getting the remaining hosts for this loop 19665 1727204190.62185: getting the next task for host managed-node3 19665 1727204190.62188: done getting next task for host managed-node3 19665 1727204190.62188: ^ task is: None 19665 1727204190.62190: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.62191: done queuing things up, now waiting for results queue to drain 19665 1727204190.62191: results queue empty 19665 1727204190.62192: checking for any_errors_fatal 19665 1727204190.62193: done checking for any_errors_fatal 19665 1727204190.62194: checking for max_fail_percentage 19665 1727204190.62194: done checking for max_fail_percentage 19665 1727204190.62195: checking to see if all hosts have failed and the running result is not ok 19665 1727204190.62195: done checking to see if all hosts have failed 19665 1727204190.62196: getting the next task for host managed-node3 19665 1727204190.62197: done getting next task for host managed-node3 19665 1727204190.62198: ^ task is: None 19665 1727204190.62199: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204190.62235: in VariableManager get_vars() 19665 1727204190.62248: done with get_vars() 19665 1727204190.62252: in VariableManager get_vars() 19665 1727204190.62258: done with get_vars() 19665 1727204190.62261: variable 'omit' from source: magic vars 19665 1727204190.62348: variable 'task' from source: play vars 19665 1727204190.62372: in VariableManager get_vars() 19665 1727204190.62379: done with get_vars() 19665 1727204190.62392: variable 'omit' from source: magic vars PLAY [Run the tasklist tasks/assert_device_absent.yml] ************************* 19665 1727204190.62510: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204190.62531: getting the remaining hosts for this loop 19665 1727204190.62532: done getting the remaining hosts for this loop 19665 1727204190.62533: getting the next task for host managed-node3 19665 1727204190.62536: done getting next task for host managed-node3 19665 1727204190.62538: ^ task is: TASK: Gathering Facts 19665 1727204190.62539: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204190.62542: getting variables 19665 1727204190.62542: in VariableManager get_vars() 19665 1727204190.62548: Calling all_inventory to load vars for managed-node3 19665 1727204190.62549: Calling groups_inventory to load vars for managed-node3 19665 1727204190.62551: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204190.62555: Calling all_plugins_play to load vars for managed-node3 19665 1727204190.62557: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204190.62559: Calling groups_plugins_play to load vars for managed-node3 19665 1727204190.63431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204190.64331: done with get_vars() 19665 1727204190.64347: done getting variables 19665 1727204190.64379: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 Tuesday 24 September 2024 14:56:30 -0400 (0:00:00.160) 0:00:41.510 ***** 19665 1727204190.64399: entering _queue_task() for managed-node3/gather_facts 19665 1727204190.64629: worker is 1 (out of 1 available) 19665 1727204190.64643: exiting _queue_task() for managed-node3/gather_facts 19665 1727204190.64654: done queuing things up, now waiting for results queue to drain 19665 1727204190.64656: waiting for pending results... 
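With the assert_profile_absent.yml checks done, the run moves on to a new play, "Run the tasklist tasks/assert_device_absent.yml", whose first task is the fact gathering queued here (run_tasks.yml:3). A rough, assumed shape of that driver play is sketched below: the play name template and the task variable come from the log, while the hosts pattern, the include mechanism and the gather_facts setting are guesses.

    # Assumed sketch of the driver play (run_tasks.yml); details may differ.
    - name: Run the tasklist {{ task }}
      hosts: all
      gather_facts: true               # produces the "Gathering Facts" task queued above
      tasks:
        - name: Include the task list
          include_tasks: "{{ task }}"  # here task == tasks/assert_device_absent.yml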
19665 1727204190.64930: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204190.65020: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000004c5 19665 1727204190.65031: variable 'ansible_search_path' from source: unknown 19665 1727204190.65067: calling self._execute() 19665 1727204190.65148: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.65185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.65201: variable 'omit' from source: magic vars 19665 1727204190.65489: variable 'ansible_distribution_major_version' from source: facts 19665 1727204190.65499: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204190.65507: variable 'omit' from source: magic vars 19665 1727204190.65527: variable 'omit' from source: magic vars 19665 1727204190.65556: variable 'omit' from source: magic vars 19665 1727204190.65589: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204190.65624: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204190.65640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204190.65656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204190.65667: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204190.65691: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204190.65694: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.65697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.65768: Set connection var ansible_connection to ssh 19665 1727204190.65774: Set connection var ansible_shell_type to sh 19665 1727204190.65780: Set connection var ansible_timeout to 10 19665 1727204190.65785: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204190.65791: Set connection var ansible_shell_executable to /bin/sh 19665 1727204190.65798: Set connection var ansible_pipelining to False 19665 1727204190.65814: variable 'ansible_shell_executable' from source: unknown 19665 1727204190.65818: variable 'ansible_connection' from source: unknown 19665 1727204190.65821: variable 'ansible_module_compression' from source: unknown 19665 1727204190.65823: variable 'ansible_shell_type' from source: unknown 19665 1727204190.65826: variable 'ansible_shell_executable' from source: unknown 19665 1727204190.65828: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204190.65830: variable 'ansible_pipelining' from source: unknown 19665 1727204190.65832: variable 'ansible_timeout' from source: unknown 19665 1727204190.65834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204190.65971: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204190.65980: variable 'omit' from source: magic vars 19665 1727204190.65985: starting attempt loop 19665 1727204190.65987: running the 
handler 19665 1727204190.66000: variable 'ansible_facts' from source: unknown 19665 1727204190.66015: _low_level_execute_command(): starting 19665 1727204190.66022: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204190.66550: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204190.66569: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.66586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204190.66605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.66622: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.66657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.66672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.66731: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.68386: stdout chunk (state=3): >>>/root <<< 19665 1727204190.68489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204190.68547: stderr chunk (state=3): >>><<< 19665 1727204190.68550: stdout chunk (state=3): >>><<< 19665 1727204190.68573: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204190.68584: _low_level_execute_command(): starting 19665 1727204190.68590: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056 `" && echo 
ansible-tmp-1727204190.6857328-23059-269408449234056="` echo /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056 `" ) && sleep 0' 19665 1727204190.69033: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204190.69049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.69066: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204190.69080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204190.69108: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.69139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.69151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.69199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.71091: stdout chunk (state=3): >>>ansible-tmp-1727204190.6857328-23059-269408449234056=/root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056 <<< 19665 1727204190.71192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204190.71252: stderr chunk (state=3): >>><<< 19665 1727204190.71256: stdout chunk (state=3): >>><<< 19665 1727204190.71275: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204190.6857328-23059-269408449234056=/root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204190.71301: variable 'ansible_module_compression' from source: unknown 19665 1727204190.71342: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204190.71398: variable 'ansible_facts' from source: unknown 19665 1727204190.71519: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/AnsiballZ_setup.py 19665 1727204190.71646: Sending initial data 19665 1727204190.71656: Sent initial data (154 bytes) 19665 1727204190.72348: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204190.72352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.72393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204190.72396: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204190.72399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204190.72401: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.72453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.72457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204190.72459: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.72509: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.74284: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204190.74327: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204190.74368: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmp4z7_mnfh /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/AnsiballZ_setup.py <<< 19665 1727204190.74408: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204190.76093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204190.76202: stderr chunk (state=3): >>><<< 19665 1727204190.76205: stdout chunk (state=3): >>><<< 19665 1727204190.76224: done transferring module to remote 19665 1727204190.76233: _low_level_execute_command(): starting 19665 1727204190.76237: 
_low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/ /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/AnsiballZ_setup.py && sleep 0' 19665 1727204190.76708: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204190.76721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.76738: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204190.76750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204190.76761: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.76809: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.76827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.76866: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204190.78695: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204190.78746: stderr chunk (state=3): >>><<< 19665 1727204190.78749: stdout chunk (state=3): >>><<< 19665 1727204190.78761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204190.78770: _low_level_execute_command(): starting 19665 1727204190.78772: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/AnsiballZ_setup.py && sleep 0' 19665 1727204190.79201: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: 
Reading configuration data /root/.ssh/config <<< 19665 1727204190.79214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204190.79232: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204190.79244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found <<< 19665 1727204190.79255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204190.79300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204190.79312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204190.79367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.30095: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "31", "epoch": "1727204191", "epoch_int": "1727204191", "date": "2024-09-24", "time": "14:56:31", "iso8601_micro": "2024-09-24T18:56:31.030126Z", "iso8601": "2024-09-24T18:56:31Z", "iso8601_basic": "20240924T145631030126", "iso8601_basic_short": "20240924T145631", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2812, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 720, "free": 2812}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": 
null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 537, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": 
"off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_cmdline": {"BOOT_IMAGE": 
"(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204191.31843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
<<< 19665 1727204191.31920: stderr chunk (state=3): >>><<< 19665 1727204191.31923: stdout chunk (state=3): >>><<< 19665 1727204191.31962: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "31", "epoch": "1727204191", "epoch_int": "1727204191", "date": "2024-09-24", "time": "14:56:31", "iso8601_micro": "2024-09-24T18:56:31.030126Z", "iso8601": "2024-09-24T18:56:31Z", "iso8601_basic": 
"20240924T145631030126", "iso8601_basic_short": "20240924T145631", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_lsb": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_apparmor": {"status": "disabled"}, "ansible_local": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2812, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 720, "free": 2812}, "nocache": {"free": 3271, "used": 261}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 537, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_iscsi_iqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, 
"ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on 
[fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_fibre_channel_wwn": [], 
"ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_is_chroot": false, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
19665 1727204191.32267: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204191.32286: _low_level_execute_command(): starting 19665 1727204191.32296: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204190.6857328-23059-269408449234056/ > /dev/null 2>&1 && sleep 0' 19665 1727204191.32924: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.32929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.32969: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.32973: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.32989: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.33056: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.33106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.34956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204191.35030: stderr chunk (state=3): >>><<< 19665 1727204191.35033: stdout chunk (state=3): >>><<< 19665 1727204191.35061: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204191.35066: handler run complete 19665 1727204191.35211: variable 'ansible_facts' from source: unknown 19665 1727204191.35280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.35520: variable 'ansible_facts' from source: unknown 19665 1727204191.35578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.35656: attempt loop complete, returning result 19665 1727204191.35660: _execute() done 19665 1727204191.35662: dumping result to json 19665 1727204191.35683: done dumping result, returning 19665 1727204191.35690: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-0000000004c5] 19665 1727204191.35695: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004c5 19665 1727204191.35987: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004c5 19665 1727204191.35990: WORKER PROCESS EXITING ok: [managed-node3] 19665 1727204191.36230: no more pending results, returning what we have 19665 1727204191.36232: results queue empty 19665 1727204191.36233: checking for any_errors_fatal 19665 1727204191.36234: done checking for any_errors_fatal 19665 1727204191.36234: checking for max_fail_percentage 19665 1727204191.36235: done checking for max_fail_percentage 19665 1727204191.36236: checking to see if all hosts have failed and the running result is not ok 19665 1727204191.36237: done checking to see if all hosts have failed 19665 1727204191.36237: getting the remaining hosts for this loop 19665 1727204191.36238: done getting the remaining hosts for this loop 19665 1727204191.36241: getting the next task for host managed-node3 19665 1727204191.36247: done getting next task for host managed-node3 19665 1727204191.36251: ^ task is: TASK: meta (flush_handlers) 19665 1727204191.36254: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204191.36258: getting variables 19665 1727204191.36259: in VariableManager get_vars() 19665 1727204191.36283: Calling all_inventory to load vars for managed-node3 19665 1727204191.36285: Calling groups_inventory to load vars for managed-node3 19665 1727204191.36289: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.36299: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.36304: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.36308: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.37478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.38601: done with get_vars() 19665 1727204191.38623: done getting variables 19665 1727204191.38713: in VariableManager get_vars() 19665 1727204191.38723: Calling all_inventory to load vars for managed-node3 19665 1727204191.38726: Calling groups_inventory to load vars for managed-node3 19665 1727204191.38728: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.38732: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.38734: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.38735: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.39561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.40638: done with get_vars() 19665 1727204191.40658: done queuing things up, now waiting for results queue to drain 19665 1727204191.40660: results queue empty 19665 1727204191.40660: checking for any_errors_fatal 19665 1727204191.40666: done checking for any_errors_fatal 19665 1727204191.40666: checking for max_fail_percentage 19665 1727204191.40667: done checking for max_fail_percentage 19665 1727204191.40667: checking to see if all hosts have failed and the running result is not ok 19665 1727204191.40672: done checking to see if all hosts have failed 19665 1727204191.40673: getting the remaining hosts for this loop 19665 1727204191.40673: done getting the remaining hosts for this loop 19665 1727204191.40675: getting the next task for host managed-node3 19665 1727204191.40678: done getting next task for host managed-node3 19665 1727204191.40680: ^ task is: TASK: Include the task '{{ task }}' 19665 1727204191.40681: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204191.40683: getting variables 19665 1727204191.40683: in VariableManager get_vars() 19665 1727204191.40690: Calling all_inventory to load vars for managed-node3 19665 1727204191.40691: Calling groups_inventory to load vars for managed-node3 19665 1727204191.40693: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.40697: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.40698: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.40700: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.41518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.42669: done with get_vars() 19665 1727204191.42684: done getting variables 19665 1727204191.42805: variable 'task' from source: play vars TASK [Include the task 'tasks/assert_device_absent.yml'] *********************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:6 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.784) 0:00:42.295 ***** 19665 1727204191.42839: entering _queue_task() for managed-node3/include_tasks 19665 1727204191.43085: worker is 1 (out of 1 available) 19665 1727204191.43098: exiting _queue_task() for managed-node3/include_tasks 19665 1727204191.43109: done queuing things up, now waiting for results queue to drain 19665 1727204191.43111: waiting for pending results... 19665 1727204191.43384: running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_device_absent.yml' 19665 1727204191.43483: in run() - task 0affcd87-79f5-0dcc-3ea6-000000000077 19665 1727204191.43494: variable 'ansible_search_path' from source: unknown 19665 1727204191.43530: calling self._execute() 19665 1727204191.43635: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204191.43650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204191.43669: variable 'omit' from source: magic vars 19665 1727204191.44084: variable 'ansible_distribution_major_version' from source: facts 19665 1727204191.44095: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204191.44101: variable 'task' from source: play vars 19665 1727204191.44153: variable 'task' from source: play vars 19665 1727204191.44159: _execute() done 19665 1727204191.44162: dumping result to json 19665 1727204191.44167: done dumping result, returning 19665 1727204191.44178: done running TaskExecutor() for managed-node3/TASK: Include the task 'tasks/assert_device_absent.yml' [0affcd87-79f5-0dcc-3ea6-000000000077] 19665 1727204191.44200: sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000077 19665 1727204191.44391: no more pending results, returning what we have 19665 1727204191.44400: in VariableManager get_vars() 19665 1727204191.44474: Calling all_inventory to load vars for managed-node3 19665 1727204191.44480: Calling groups_inventory to load vars for managed-node3 19665 1727204191.44490: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.44511: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.44517: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.44520: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.45116: done sending task result for task 0affcd87-79f5-0dcc-3ea6-000000000077 19665 1727204191.45120: WORKER PROCESS 
EXITING 19665 1727204191.46212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.52950: done with get_vars() 19665 1727204191.52968: variable 'ansible_search_path' from source: unknown 19665 1727204191.52979: we have included files to process 19665 1727204191.52980: generating all_blocks data 19665 1727204191.52980: done generating all_blocks data 19665 1727204191.52981: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204191.52982: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204191.52983: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 19665 1727204191.53047: in VariableManager get_vars() 19665 1727204191.53060: done with get_vars() 19665 1727204191.53129: done processing included file 19665 1727204191.53131: iterating over new_blocks loaded from include file 19665 1727204191.53131: in VariableManager get_vars() 19665 1727204191.53138: done with get_vars() 19665 1727204191.53139: filtering new block on tags 19665 1727204191.53152: done filtering new block on tags 19665 1727204191.53154: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node3 19665 1727204191.53156: extending task lists for all hosts with included blocks 19665 1727204191.53180: done extending task lists 19665 1727204191.53181: done processing included files 19665 1727204191.53181: results queue empty 19665 1727204191.53182: checking for any_errors_fatal 19665 1727204191.53183: done checking for any_errors_fatal 19665 1727204191.53183: checking for max_fail_percentage 19665 1727204191.53184: done checking for max_fail_percentage 19665 1727204191.53184: checking to see if all hosts have failed and the running result is not ok 19665 1727204191.53185: done checking to see if all hosts have failed 19665 1727204191.53185: getting the remaining hosts for this loop 19665 1727204191.53186: done getting the remaining hosts for this loop 19665 1727204191.53187: getting the next task for host managed-node3 19665 1727204191.53190: done getting next task for host managed-node3 19665 1727204191.53191: ^ task is: TASK: Include the task 'get_interface_stat.yml' 19665 1727204191.53192: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204191.53194: getting variables 19665 1727204191.53194: in VariableManager get_vars() 19665 1727204191.53200: Calling all_inventory to load vars for managed-node3 19665 1727204191.53201: Calling groups_inventory to load vars for managed-node3 19665 1727204191.53202: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.53206: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.53208: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.53209: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.57106: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.59119: done with get_vars() 19665 1727204191.59151: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.163) 0:00:42.458 ***** 19665 1727204191.59228: entering _queue_task() for managed-node3/include_tasks 19665 1727204191.59579: worker is 1 (out of 1 available) 19665 1727204191.59591: exiting _queue_task() for managed-node3/include_tasks 19665 1727204191.59604: done queuing things up, now waiting for results queue to drain 19665 1727204191.59606: waiting for pending results... 19665 1727204191.59927: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 19665 1727204191.60092: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000004d6 19665 1727204191.60109: variable 'ansible_search_path' from source: unknown 19665 1727204191.60115: variable 'ansible_search_path' from source: unknown 19665 1727204191.60158: calling self._execute() 19665 1727204191.60267: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204191.60278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204191.60295: variable 'omit' from source: magic vars 19665 1727204191.61168: variable 'ansible_distribution_major_version' from source: facts 19665 1727204191.61187: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204191.61197: _execute() done 19665 1727204191.61222: dumping result to json 19665 1727204191.61230: done dumping result, returning 19665 1727204191.61240: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [0affcd87-79f5-0dcc-3ea6-0000000004d6] 19665 1727204191.61277: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004d6 19665 1727204191.61424: no more pending results, returning what we have 19665 1727204191.61429: in VariableManager get_vars() 19665 1727204191.61467: Calling all_inventory to load vars for managed-node3 19665 1727204191.61470: Calling groups_inventory to load vars for managed-node3 19665 1727204191.61475: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.61489: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.61492: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.61495: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.62773: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004d6 19665 1727204191.62777: WORKER PROCESS EXITING 19665 1727204191.64103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 19665 1727204191.66042: done with get_vars() 19665 1727204191.66069: variable 'ansible_search_path' from source: unknown 19665 1727204191.66071: variable 'ansible_search_path' from source: unknown 19665 1727204191.66082: variable 'task' from source: play vars 19665 1727204191.66205: variable 'task' from source: play vars 19665 1727204191.66245: we have included files to process 19665 1727204191.66246: generating all_blocks data 19665 1727204191.66248: done generating all_blocks data 19665 1727204191.66249: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204191.66250: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204191.66256: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 19665 1727204191.66450: done processing included file 19665 1727204191.66452: iterating over new_blocks loaded from include file 19665 1727204191.66454: in VariableManager get_vars() 19665 1727204191.66473: done with get_vars() 19665 1727204191.66475: filtering new block on tags 19665 1727204191.66492: done filtering new block on tags 19665 1727204191.66494: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 19665 1727204191.66499: extending task lists for all hosts with included blocks 19665 1727204191.66613: done extending task lists 19665 1727204191.66615: done processing included files 19665 1727204191.66616: results queue empty 19665 1727204191.66617: checking for any_errors_fatal 19665 1727204191.66621: done checking for any_errors_fatal 19665 1727204191.66622: checking for max_fail_percentage 19665 1727204191.66623: done checking for max_fail_percentage 19665 1727204191.66624: checking to see if all hosts have failed and the running result is not ok 19665 1727204191.66625: done checking to see if all hosts have failed 19665 1727204191.66625: getting the remaining hosts for this loop 19665 1727204191.66627: done getting the remaining hosts for this loop 19665 1727204191.66630: getting the next task for host managed-node3 19665 1727204191.66634: done getting next task for host managed-node3 19665 1727204191.66636: ^ task is: TASK: Get stat for interface {{ interface }} 19665 1727204191.66639: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204191.66641: getting variables 19665 1727204191.66641: in VariableManager get_vars() 19665 1727204191.66650: Calling all_inventory to load vars for managed-node3 19665 1727204191.66652: Calling groups_inventory to load vars for managed-node3 19665 1727204191.66654: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204191.66660: Calling all_plugins_play to load vars for managed-node3 19665 1727204191.66662: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204191.66667: Calling groups_plugins_play to load vars for managed-node3 19665 1727204191.68272: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204191.69258: done with get_vars() 19665 1727204191.69281: done getting variables 19665 1727204191.69433: variable 'interface' from source: set_fact TASK [Get stat for interface LSR-TST-br31] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:56:31 -0400 (0:00:00.102) 0:00:42.561 ***** 19665 1727204191.69462: entering _queue_task() for managed-node3/stat 19665 1727204191.69824: worker is 1 (out of 1 available) 19665 1727204191.69841: exiting _queue_task() for managed-node3/stat 19665 1727204191.69856: done queuing things up, now waiting for results queue to drain 19665 1727204191.69858: waiting for pending results... 19665 1727204191.70171: running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 19665 1727204191.70318: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000004e1 19665 1727204191.70338: variable 'ansible_search_path' from source: unknown 19665 1727204191.70346: variable 'ansible_search_path' from source: unknown 19665 1727204191.70393: calling self._execute() 19665 1727204191.70503: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204191.70519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204191.70532: variable 'omit' from source: magic vars 19665 1727204191.70933: variable 'ansible_distribution_major_version' from source: facts 19665 1727204191.70961: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204191.70974: variable 'omit' from source: magic vars 19665 1727204191.71027: variable 'omit' from source: magic vars 19665 1727204191.71140: variable 'interface' from source: set_fact 19665 1727204191.71175: variable 'omit' from source: magic vars 19665 1727204191.71222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204191.71276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204191.71304: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204191.71326: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204191.71343: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204191.71387: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204191.71395: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204191.71404: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node3' 19665 1727204191.71512: Set connection var ansible_connection to ssh 19665 1727204191.71524: Set connection var ansible_shell_type to sh 19665 1727204191.71534: Set connection var ansible_timeout to 10 19665 1727204191.71542: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204191.71555: Set connection var ansible_shell_executable to /bin/sh 19665 1727204191.71570: Set connection var ansible_pipelining to False 19665 1727204191.71603: variable 'ansible_shell_executable' from source: unknown 19665 1727204191.71612: variable 'ansible_connection' from source: unknown 19665 1727204191.71619: variable 'ansible_module_compression' from source: unknown 19665 1727204191.71629: variable 'ansible_shell_type' from source: unknown 19665 1727204191.71636: variable 'ansible_shell_executable' from source: unknown 19665 1727204191.71641: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204191.71652: variable 'ansible_pipelining' from source: unknown 19665 1727204191.71659: variable 'ansible_timeout' from source: unknown 19665 1727204191.71669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204191.71898: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 19665 1727204191.71915: variable 'omit' from source: magic vars 19665 1727204191.71932: starting attempt loop 19665 1727204191.71940: running the handler 19665 1727204191.71958: _low_level_execute_command(): starting 19665 1727204191.71973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204191.72772: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204191.72788: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.72803: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.72827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.72873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.72885: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204191.72898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.72921: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204191.72934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204191.72944: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204191.72957: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.72974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.72989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.73001: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.73012: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204191.73031: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.73109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204191.73133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204191.73155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.73234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.74865: stdout chunk (state=3): >>>/root <<< 19665 1727204191.75069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204191.75072: stdout chunk (state=3): >>><<< 19665 1727204191.75075: stderr chunk (state=3): >>><<< 19665 1727204191.75192: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204191.75195: _low_level_execute_command(): starting 19665 1727204191.75199: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634 `" && echo ansible-tmp-1727204191.7509627-23093-138520870585634="` echo /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634 `" ) && sleep 0' 19665 1727204191.76050: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.76056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.76090: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204191.76093: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.76096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.76172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204191.76381: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.76476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.78672: stdout chunk (state=3): >>>ansible-tmp-1727204191.7509627-23093-138520870585634=/root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634 <<< 19665 1727204191.78676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204191.78956: stderr chunk (state=3): >>><<< 19665 1727204191.78960: stdout chunk (state=3): >>><<< 19665 1727204191.79174: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204191.7509627-23093-138520870585634=/root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204191.79180: variable 'ansible_module_compression' from source: unknown 19665 1727204191.79182: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 19665 1727204191.79185: variable 'ansible_facts' from source: unknown 19665 1727204191.79722: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/AnsiballZ_stat.py 19665 1727204191.79773: Sending initial data 19665 1727204191.79781: Sent initial data (153 bytes) 19665 1727204191.81584: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.81621: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.81690: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204191.81696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.81756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.83590: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204191.83634: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204191.83677: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpdyeoax1_ /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/AnsiballZ_stat.py <<< 19665 1727204191.83713: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204191.84936: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204191.85195: stderr chunk (state=3): >>><<< 19665 1727204191.85198: stdout chunk (state=3): >>><<< 19665 1727204191.85201: done transferring module to remote 19665 1727204191.85203: _low_level_execute_command(): starting 19665 1727204191.85205: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/ /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/AnsiballZ_stat.py && sleep 0' 19665 1727204191.86112: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204191.86526: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.86575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.86612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.86643: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.86652: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204191.86662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.86682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204191.86685: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204191.86693: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204191.86696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.86706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.86727: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.86733: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.86740: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204191.86758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.86822: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204191.86838: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204191.86852: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.86931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204191.88790: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204191.88949: stderr chunk (state=3): >>><<< 19665 1727204191.88970: stdout chunk (state=3): >>><<< 19665 1727204191.89251: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204191.89255: _low_level_execute_command(): starting 19665 1727204191.89257: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/AnsiballZ_stat.py && sleep 0' 19665 1727204191.89914: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204191.89942: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.89966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.89992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.90055: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.90070: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204191.90084: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.90115: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204191.90129: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 
1727204191.90150: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204191.90176: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204191.90198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204191.90221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204191.90248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204191.90272: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204191.90295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204191.90367: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204191.90396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204191.90417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204191.90779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.04222: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} <<< 19665 1727204192.05283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204192.05353: stderr chunk (state=3): >>><<< 19665 1727204192.05357: stdout chunk (state=3): >>><<< 19665 1727204192.05375: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/LSR-TST-br31", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 
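[editor's note: a minimal task sketch consistent with the stat invocation logged above; the task name, path, module arguments, and register variable are taken from entries in this log, while the exact layout of the real file in the test collection is an assumption.]

- name: Get stat for interface LSR-TST-br31
  ansible.builtin.stat:
    path: /sys/class/net/LSR-TST-br31   # existence of this sysfs entry means the interface exists
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: interface_stat              # consumed by the assert task later in this log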
19665 1727204192.05398: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/LSR-TST-br31', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204192.05442: _low_level_execute_command(): starting 19665 1727204192.05445: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204191.7509627-23093-138520870585634/ > /dev/null 2>&1 && sleep 0' 19665 1727204192.05938: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.05949: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.05995: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204192.06025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration <<< 19665 1727204192.06033: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.06137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.06141: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.06174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.06296: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.08101: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204192.08166: stderr chunk (state=3): >>><<< 19665 1727204192.08193: stdout chunk (state=3): >>><<< 19665 1727204192.08212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204192.08231: handler run complete 19665 1727204192.08256: attempt loop complete, returning result 19665 1727204192.08272: _execute() done 19665 1727204192.08274: dumping result to json 19665 1727204192.08276: done dumping result, returning 19665 1727204192.08278: done running TaskExecutor() for managed-node3/TASK: Get stat for interface LSR-TST-br31 [0affcd87-79f5-0dcc-3ea6-0000000004e1] 19665 1727204192.08280: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004e1 19665 1727204192.08508: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004e1 19665 1727204192.08511: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 19665 1727204192.08605: no more pending results, returning what we have 19665 1727204192.08608: results queue empty 19665 1727204192.08609: checking for any_errors_fatal 19665 1727204192.08611: done checking for any_errors_fatal 19665 1727204192.08612: checking for max_fail_percentage 19665 1727204192.08613: done checking for max_fail_percentage 19665 1727204192.08614: checking to see if all hosts have failed and the running result is not ok 19665 1727204192.08615: done checking to see if all hosts have failed 19665 1727204192.08615: getting the remaining hosts for this loop 19665 1727204192.08625: done getting the remaining hosts for this loop 19665 1727204192.08629: getting the next task for host managed-node3 19665 1727204192.08636: done getting next task for host managed-node3 19665 1727204192.08639: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 19665 1727204192.08641: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204192.08647: getting variables 19665 1727204192.08648: in VariableManager get_vars() 19665 1727204192.08728: Calling all_inventory to load vars for managed-node3 19665 1727204192.08732: Calling groups_inventory to load vars for managed-node3 19665 1727204192.08735: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.08757: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.08762: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.08768: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.10058: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.11496: done with get_vars() 19665 1727204192.11524: done getting variables 19665 1727204192.11576: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 19665 1727204192.11683: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'LSR-TST-br31'] ******************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.422) 0:00:42.983 ***** 19665 1727204192.11708: entering _queue_task() for managed-node3/assert 19665 1727204192.13346: worker is 1 (out of 1 available) 19665 1727204192.13445: exiting _queue_task() for managed-node3/assert 19665 1727204192.13486: done queuing things up, now waiting for results queue to drain 19665 1727204192.13488: waiting for pending results... 
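[editor's note: a hedged sketch of the assertion queued above (assert_device_absent.yml:5); the conditional matches the one the log shows being evaluated a few entries below, anything beyond that is an assumption about the real test file.]

- name: Assert that the interface is absent - '{{ interface }}'
  ansible.builtin.assert:
    that:
      - not interface_stat.stat.exists   # interface_stat comes from the preceding stat task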
19665 1727204192.13534: running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' 19665 1727204192.13620: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000004d7 19665 1727204192.13636: variable 'ansible_search_path' from source: unknown 19665 1727204192.13651: variable 'ansible_search_path' from source: unknown 19665 1727204192.13673: calling self._execute() 19665 1727204192.13786: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.13794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.13802: variable 'omit' from source: magic vars 19665 1727204192.14118: variable 'ansible_distribution_major_version' from source: facts 19665 1727204192.14139: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204192.14152: variable 'omit' from source: magic vars 19665 1727204192.14199: variable 'omit' from source: magic vars 19665 1727204192.14304: variable 'interface' from source: set_fact 19665 1727204192.14325: variable 'omit' from source: magic vars 19665 1727204192.14372: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204192.14411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204192.14440: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204192.14458: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204192.14469: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204192.14506: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204192.14509: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.14512: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.14602: Set connection var ansible_connection to ssh 19665 1727204192.14609: Set connection var ansible_shell_type to sh 19665 1727204192.14616: Set connection var ansible_timeout to 10 19665 1727204192.14620: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204192.14631: Set connection var ansible_shell_executable to /bin/sh 19665 1727204192.14637: Set connection var ansible_pipelining to False 19665 1727204192.14667: variable 'ansible_shell_executable' from source: unknown 19665 1727204192.14670: variable 'ansible_connection' from source: unknown 19665 1727204192.14673: variable 'ansible_module_compression' from source: unknown 19665 1727204192.14678: variable 'ansible_shell_type' from source: unknown 19665 1727204192.14693: variable 'ansible_shell_executable' from source: unknown 19665 1727204192.14696: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.14698: variable 'ansible_pipelining' from source: unknown 19665 1727204192.14700: variable 'ansible_timeout' from source: unknown 19665 1727204192.14702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.14801: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, 
class_only=False) 19665 1727204192.14827: variable 'omit' from source: magic vars 19665 1727204192.14833: starting attempt loop 19665 1727204192.14837: running the handler 19665 1727204192.14987: variable 'interface_stat' from source: set_fact 19665 1727204192.14990: Evaluated conditional (not interface_stat.stat.exists): True 19665 1727204192.14992: handler run complete 19665 1727204192.15008: attempt loop complete, returning result 19665 1727204192.15019: _execute() done 19665 1727204192.15026: dumping result to json 19665 1727204192.15031: done dumping result, returning 19665 1727204192.15040: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is absent - 'LSR-TST-br31' [0affcd87-79f5-0dcc-3ea6-0000000004d7] 19665 1727204192.15051: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004d7 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 19665 1727204192.15328: no more pending results, returning what we have 19665 1727204192.15332: results queue empty 19665 1727204192.15333: checking for any_errors_fatal 19665 1727204192.15344: done checking for any_errors_fatal 19665 1727204192.15345: checking for max_fail_percentage 19665 1727204192.15347: done checking for max_fail_percentage 19665 1727204192.15347: checking to see if all hosts have failed and the running result is not ok 19665 1727204192.15348: done checking to see if all hosts have failed 19665 1727204192.15349: getting the remaining hosts for this loop 19665 1727204192.15352: done getting the remaining hosts for this loop 19665 1727204192.15356: getting the next task for host managed-node3 19665 1727204192.15366: done getting next task for host managed-node3 19665 1727204192.15368: ^ task is: TASK: meta (flush_handlers) 19665 1727204192.15370: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204192.15373: getting variables 19665 1727204192.15375: in VariableManager get_vars() 19665 1727204192.15857: Calling all_inventory to load vars for managed-node3 19665 1727204192.15860: Calling groups_inventory to load vars for managed-node3 19665 1727204192.15866: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.15871: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004d7 19665 1727204192.15874: WORKER PROCESS EXITING 19665 1727204192.15883: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.15887: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.15890: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.17315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.19153: done with get_vars() 19665 1727204192.19181: done getting variables 19665 1727204192.19258: in VariableManager get_vars() 19665 1727204192.19272: Calling all_inventory to load vars for managed-node3 19665 1727204192.19274: Calling groups_inventory to load vars for managed-node3 19665 1727204192.19277: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.19282: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.19284: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.19287: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.20458: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.22157: done with get_vars() 19665 1727204192.22190: done queuing things up, now waiting for results queue to drain 19665 1727204192.22192: results queue empty 19665 1727204192.22193: checking for any_errors_fatal 19665 1727204192.22196: done checking for any_errors_fatal 19665 1727204192.22197: checking for max_fail_percentage 19665 1727204192.22198: done checking for max_fail_percentage 19665 1727204192.22199: checking to see if all hosts have failed and the running result is not ok 19665 1727204192.22200: done checking to see if all hosts have failed 19665 1727204192.22206: getting the remaining hosts for this loop 19665 1727204192.22207: done getting the remaining hosts for this loop 19665 1727204192.22209: getting the next task for host managed-node3 19665 1727204192.22214: done getting next task for host managed-node3 19665 1727204192.22215: ^ task is: TASK: meta (flush_handlers) 19665 1727204192.22217: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204192.22219: getting variables 19665 1727204192.22220: in VariableManager get_vars() 19665 1727204192.22229: Calling all_inventory to load vars for managed-node3 19665 1727204192.22232: Calling groups_inventory to load vars for managed-node3 19665 1727204192.22234: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.22240: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.22245: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.22249: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.23547: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.25181: done with get_vars() 19665 1727204192.25204: done getting variables 19665 1727204192.25255: in VariableManager get_vars() 19665 1727204192.25266: Calling all_inventory to load vars for managed-node3 19665 1727204192.25268: Calling groups_inventory to load vars for managed-node3 19665 1727204192.25270: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.25275: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.25277: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.25279: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.26485: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.28128: done with get_vars() 19665 1727204192.28159: done queuing things up, now waiting for results queue to drain 19665 1727204192.28162: results queue empty 19665 1727204192.28163: checking for any_errors_fatal 19665 1727204192.28165: done checking for any_errors_fatal 19665 1727204192.28166: checking for max_fail_percentage 19665 1727204192.28167: done checking for max_fail_percentage 19665 1727204192.28168: checking to see if all hosts have failed and the running result is not ok 19665 1727204192.28169: done checking to see if all hosts have failed 19665 1727204192.28169: getting the remaining hosts for this loop 19665 1727204192.28170: done getting the remaining hosts for this loop 19665 1727204192.28173: getting the next task for host managed-node3 19665 1727204192.28177: done getting next task for host managed-node3 19665 1727204192.28177: ^ task is: None 19665 1727204192.28179: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204192.28180: done queuing things up, now waiting for results queue to drain 19665 1727204192.28181: results queue empty 19665 1727204192.28182: checking for any_errors_fatal 19665 1727204192.28182: done checking for any_errors_fatal 19665 1727204192.28183: checking for max_fail_percentage 19665 1727204192.28184: done checking for max_fail_percentage 19665 1727204192.28184: checking to see if all hosts have failed and the running result is not ok 19665 1727204192.28185: done checking to see if all hosts have failed 19665 1727204192.28186: getting the next task for host managed-node3 19665 1727204192.28189: done getting next task for host managed-node3 19665 1727204192.28189: ^ task is: None 19665 1727204192.28191: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 19665 1727204192.28230: in VariableManager get_vars() 19665 1727204192.28247: done with get_vars() 19665 1727204192.28253: in VariableManager get_vars() 19665 1727204192.28263: done with get_vars() 19665 1727204192.28269: variable 'omit' from source: magic vars 19665 1727204192.28301: in VariableManager get_vars() 19665 1727204192.28310: done with get_vars() 19665 1727204192.28332: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 19665 1727204192.28517: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 19665 1727204192.28541: getting the remaining hosts for this loop 19665 1727204192.28545: done getting the remaining hosts for this loop 19665 1727204192.28548: getting the next task for host managed-node3 19665 1727204192.28550: done getting next task for host managed-node3 19665 1727204192.28552: ^ task is: TASK: Gathering Facts 19665 1727204192.28554: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204192.28556: getting variables 19665 1727204192.28557: in VariableManager get_vars() 19665 1727204192.28567: Calling all_inventory to load vars for managed-node3 19665 1727204192.28569: Calling groups_inventory to load vars for managed-node3 19665 1727204192.28571: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204192.28576: Calling all_plugins_play to load vars for managed-node3 19665 1727204192.28579: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204192.28582: Calling groups_plugins_play to load vars for managed-node3 19665 1727204192.29898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204192.31562: done with get_vars() 19665 1727204192.31590: done getting variables 19665 1727204192.31635: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 Tuesday 24 September 2024 14:56:32 -0400 (0:00:00.199) 0:00:43.183 ***** 19665 1727204192.31661: entering _queue_task() for managed-node3/gather_facts 19665 1727204192.31981: worker is 1 (out of 1 available) 19665 1727204192.31993: exiting _queue_task() for managed-node3/gather_facts 19665 1727204192.32004: done queuing things up, now waiting for results queue to drain 19665 1727204192.32006: waiting for pending results... 
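[editor's note: a rough sketch of the play queued above (tests_bridge.yml:64); only the play name and the implicit fact gathering are taken from the log, the hosts pattern and the placeholder task are assumptions.]

- name: Verify that cleanup restored state to default
  hosts: all               # assumption; this log only shows managed-node3 being targeted
  gather_facts: true       # produces the "Gathering Facts" task traced below
  tasks:
    - name: Placeholder verification step (assumption)
      ansible.builtin.debug:
        msg: the cleanup-verification tasks from the real test file would follow here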
19665 1727204192.32293: running TaskExecutor() for managed-node3/TASK: Gathering Facts 19665 1727204192.32413: in run() - task 0affcd87-79f5-0dcc-3ea6-0000000004fa 19665 1727204192.32437: variable 'ansible_search_path' from source: unknown 19665 1727204192.32489: calling self._execute() 19665 1727204192.32597: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.32609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.32625: variable 'omit' from source: magic vars 19665 1727204192.33050: variable 'ansible_distribution_major_version' from source: facts 19665 1727204192.33073: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204192.33083: variable 'omit' from source: magic vars 19665 1727204192.33120: variable 'omit' from source: magic vars 19665 1727204192.33169: variable 'omit' from source: magic vars 19665 1727204192.33220: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204192.33267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204192.33296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204192.33320: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204192.33338: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204192.33377: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204192.33386: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.33395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.33504: Set connection var ansible_connection to ssh 19665 1727204192.33517: Set connection var ansible_shell_type to sh 19665 1727204192.33529: Set connection var ansible_timeout to 10 19665 1727204192.33541: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204192.33558: Set connection var ansible_shell_executable to /bin/sh 19665 1727204192.33573: Set connection var ansible_pipelining to False 19665 1727204192.33601: variable 'ansible_shell_executable' from source: unknown 19665 1727204192.33608: variable 'ansible_connection' from source: unknown 19665 1727204192.33614: variable 'ansible_module_compression' from source: unknown 19665 1727204192.33623: variable 'ansible_shell_type' from source: unknown 19665 1727204192.33630: variable 'ansible_shell_executable' from source: unknown 19665 1727204192.33635: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204192.33645: variable 'ansible_pipelining' from source: unknown 19665 1727204192.33654: variable 'ansible_timeout' from source: unknown 19665 1727204192.33666: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204192.33856: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204192.33877: variable 'omit' from source: magic vars 19665 1727204192.33889: starting attempt loop 19665 1727204192.33896: running the 
handler 19665 1727204192.33916: variable 'ansible_facts' from source: unknown 19665 1727204192.33941: _low_level_execute_command(): starting 19665 1727204192.33956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204192.34724: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204192.34740: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.34766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.34790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.34839: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.34857: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204192.34877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.34897: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204192.34909: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204192.34920: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204192.34933: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.34950: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.34971: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.34985: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.34997: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204192.35010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.35094: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.35111: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.35126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.35228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.36855: stdout chunk (state=3): >>>/root <<< 19665 1727204192.36947: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204192.37047: stderr chunk (state=3): >>><<< 19665 1727204192.37061: stdout chunk (state=3): >>><<< 19665 1727204192.37202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204192.37206: _low_level_execute_command(): starting 19665 1727204192.37209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650 `" && echo ansible-tmp-1727204192.371022-23116-193943023115650="` echo /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650 `" ) && sleep 0' 19665 1727204192.38014: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204192.38031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.38050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.38172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.38260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.38279: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204192.38304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.38331: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204192.38348: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204192.38361: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204192.38377: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.38390: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.38404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.38416: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.38427: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204192.38439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.38532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.38536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.38543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.38610: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.40432: stdout chunk (state=3): >>>ansible-tmp-1727204192.371022-23116-193943023115650=/root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650 <<< 19665 1727204192.40634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204192.40638: stdout chunk (state=3): >>><<< 19665 1727204192.40640: stderr chunk (state=3): >>><<< 19665 1727204192.40774: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204192.371022-23116-193943023115650=/root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204192.40778: variable 'ansible_module_compression' from source: unknown 19665 1727204192.40780: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 19665 1727204192.40883: variable 'ansible_facts' from source: unknown 19665 1727204192.41027: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/AnsiballZ_setup.py 19665 1727204192.42019: Sending initial data 19665 1727204192.42023: Sent initial data (153 bytes) 19665 1727204192.44292: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.44297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.44308: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.44316: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204192.44327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.44341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204192.44349: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204192.44356: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204192.44366: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.44376: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.44387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.44394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.44401: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204192.44412: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.44484: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.44502: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.44515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.44587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.46350: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204192.46392: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204192.46433: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpee9hvyul /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/AnsiballZ_setup.py <<< 19665 1727204192.46470: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204192.49503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204192.49589: stderr chunk (state=3): >>><<< 19665 1727204192.49592: stdout chunk (state=3): >>><<< 19665 1727204192.49616: done transferring module to remote 19665 1727204192.49628: _low_level_execute_command(): starting 19665 1727204192.49631: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/ /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/AnsiballZ_setup.py && sleep 0' 19665 1727204192.51828: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204192.52686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.52696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.52710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.52755: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.52763: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204192.52776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.52788: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204192.52795: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204192.52801: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204192.52809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.52818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.52830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.52837: stderr 
chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.52846: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204192.52852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.52927: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.52948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.52957: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.53031: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204192.54871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204192.54875: stdout chunk (state=3): >>><<< 19665 1727204192.54882: stderr chunk (state=3): >>><<< 19665 1727204192.54899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204192.54904: _low_level_execute_command(): starting 19665 1727204192.54906: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/AnsiballZ_setup.py && sleep 0' 19665 1727204192.55749: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204192.55757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204192.55769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.55785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.55822: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.55828: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204192.55836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.55850: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204192.55857: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204192.55866: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204192.55875: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config <<< 19665 1727204192.55885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204192.55898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204192.55904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204192.55912: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204192.55921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204192.55992: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204192.56011: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204192.56023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204192.56100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.06403: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": 
"ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 538, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "33", "epoch": "1727204193", "epoch_int": "1727204193", "date": "2024-09-24", "time": "14:56:33", "iso8601_micro": "2024-09-24T18:56:33.023536Z", "iso8601": "2024-09-24T18:56:33Z", "iso8601_basic": "20240924T145633023536", "iso8601_basic_short": "20240924T145633", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", <<< 19665 1727204193.06447: stdout chunk (state=3): >>>"tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", 
"tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 19665 1727204193.08087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204193.08178: stderr chunk (state=3): >>><<< 19665 1727204193.08182: stdout chunk (state=3): >>><<< 19665 1727204193.08476: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "5.14.0-511.el9.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 19 06:52:39 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.9.19", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "17639b67ac7f4f0eaf69642a93854be7", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAKPEkaFEOfyLRWf/ytDK/ece4HG9Vs7QRYRKiqVrxVfx/uC7z/xpjkTjz2e/reN9chL0uYXfAUHLT5zQizp+wHj01l7h7BmeEa5FLpqDn3aSco5OeZQT93bt+RqBhVagysRC7yYbxsta2AJSQ91RtsoaLd9hw2arIX0pjeqh9JnVAAAAFQDYE8eGyVKl3GWR/vJ5nBDRF/STXQAAAIAkRCSeh2d0zA4D4eGHZKDjisvN6MPvspZOngRY05qRIEPhkvMFP8YJVo+RD+0sYMqbWwEPB/8eQ5uKfzvIEVFCoDfKXjbfekcGRkLB9GfovuNGyTHNz4Y37wwFAT5EZ+5KXbU+PGP80ZmfaRhtVKgjveNuP/5vN2fFTXHzdE51fgAAAIAJvTztR3w6AKEg6SJxYbrLm5rtoQjt1Hclpz3Tvm4gEvwhK5ewDrJqfJoFaxwuX7GnJbq+91neTbl4ZfjpQ5z+1RMpjBoQkG1bJkkMNtVmQ0ezCkW5kcC3To+zodlDP3aqBZVBpTbfFJnwluh5TJbXmylLNlbSFzm8WuANbYW16A==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCStxbMVDo05qvbxwmf+gQSUB/l38jNPH28+h+LZuyYc9QOaAucvcy4WXyiRNMka8l5+4Zlm8BtWYOw75Yhj6ZSXb3MIreZ6EF9sxUt8FHgPbBB+KYaZq2naZ+rTqEJYh+4WAckdrXob8q7vF7CdyfdG6reviM1+XefRlHuC7jkn+pc5mqXsUu2AxkSxrhFoytGwIHdi5s6xFD09xxZRAIPi+kLTa4Del1SdPvV2Gf4e359P4xTH9yCRDq5XbNXK7aYoNMWYnMnbI7qjfJDViaqkydciVGpMVdP3wXxwO2tAL+GBiffx11PbK2L4CZvucTYoa1UNlQmrG7pkmji3AG/8FXhIqKSEOUEvNq8R0tGTsY4jqRTPLT6z89wbgV24t96J1q4swQafiMbv3bxpjqVlaxT8BxtNIK0t4SwoezsdTsLezhhAVF8lGQ2rbT1IPqaB9Ozs3GpLJGvKuNWfLm4W2DNPeAZvmTF2ZhCxmERxZOTEL2a3r2sShhZL7VT0ms=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJdOgKJEZAcWhWWhm0URntCw5IWTaPfzgxU4WxT42VMKpe5IjXefD56B7mCVtWDJqr8WBwrNK5BxR3ujZ2UzVvM=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAINrYRUtH6QyTpsgcsx30FuMNOymnkP0V0KNL9DpYDfGO", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.42, "5m": 0.35, "15m": 0.18}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,msdos1)/boot/vmlinuz-5.14.0-511.el9.x86_64", "root": "UUID=ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.9", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.14.85 53286 10.31.15.87 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.14.85 53286 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "which_declare": "declare -f", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0", "BASH_FUNC_which%%": "() { ( alias;\n eval ${which_declare} ) | /usr/bin/which --tty-only --read-alias --read-functions --show-tilde --show-dot $@\n}"}, "ansible_iscsi_iqn": "", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:d5aef1ea-3141-48ae-bf33-0c6b351dd422", "ansible_local": {}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "9", "ansible_distribution_major_version": "9", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": 
"enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3532, "ansible_memfree_mb": 2807, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3532, "used": 725, "free": 2807}, "nocache": {"free": 3266, "used": 266}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_uuid": "ec2ceb79-bfdf-2ab3-fbd4-199887493eb4", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda1": {"links": {"ids": [], "uuids": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"], "labels": [], "masters": []}, "start": "2048", "sectors": "524285919", "sectorsize": 512, "size": "250.00 GB", "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae", "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda1": ["ad406aa3-aab4-4a6a-aa73-3e870a6316ae"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 538, "ansible_lvm": "N/A", "ansible_mounts": [{"mount": "/", "device": "/dev/xvda1", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268367278080, "size_available": 264282144768, "block_size": 4096, "block_total": 65519355, "block_available": 64522008, "block_used": 997347, "inode_total": 131071472, "inode_available": 130998312, "inode_used": 73160, "uuid": "ad406aa3-aab4-4a6a-aa73-3e870a6316ae"}], "ansible_python": {"version": {"major": 3, "minor": 9, "micro": 19, "releaselevel": "final", "serial": 0}, "version_info": [3, 9, 19, "final", 0], "executable": "/usr/bin/python3.9", "has_sslcontext": true, "type": "cpython"}, "ansible_fips": false, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "56", "second": "33", "epoch": "1727204193", "epoch_int": "1727204193", "date": "2024-09-24", "time": "14:56:33", "iso8601_micro": "2024-09-24T18:56:33.023536Z", "iso8601": "2024-09-24T18:56:33Z", "iso8601_basic": "20240924T145633023536", "iso8601_basic_short": "20240924T145633", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_service_mgr": 
"systemd", "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22"}, "ipv6": [{"address": "fe80::8ff:f5ff:fed7:be93", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", 
"generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.12.1", "interface": "eth0", "address": "10.31.15.87", "broadcast": "10.31.15.255", "netmask": "255.255.252.0", "network": "10.31.12.0", "prefix": "22", "macaddress": "0a:ff:f5:d7:be:93", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.15.87"], "ansible_all_ipv6_addresses": ["fe80::8ff:f5ff:fed7:be93"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.15.87", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::8ff:f5ff:fed7:be93"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204193.08621: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204193.08652: _low_level_execute_command(): starting 19665 1727204193.08661: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204192.371022-23116-193943023115650/ > /dev/null 2>&1 && sleep 0' 19665 1727204193.09422: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.09435: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.09450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.09474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.09570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.09620: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.09641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.09672: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.09716: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.09729: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.09771: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.09816: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.09850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.09862: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.09894: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.09934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.10053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.10098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.10132: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.10222: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.12037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.12093: stderr chunk (state=3): >>><<< 19665 1727204193.12096: stdout chunk 
(state=3): >>><<< 19665 1727204193.12560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.12565: handler run complete 19665 1727204193.12568: variable 'ansible_facts' from source: unknown 19665 1727204193.12571: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.12734: variable 'ansible_facts' from source: unknown 19665 1727204193.12818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.12928: attempt loop complete, returning result 19665 1727204193.12937: _execute() done 19665 1727204193.12947: dumping result to json 19665 1727204193.12983: done dumping result, returning 19665 1727204193.13076: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [0affcd87-79f5-0dcc-3ea6-0000000004fa] 19665 1727204193.13087: sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004fa ok: [managed-node3] 19665 1727204193.13893: no more pending results, returning what we have 19665 1727204193.13896: results queue empty 19665 1727204193.13897: checking for any_errors_fatal 19665 1727204193.13899: done checking for any_errors_fatal 19665 1727204193.13899: checking for max_fail_percentage 19665 1727204193.13901: done checking for max_fail_percentage 19665 1727204193.13902: checking to see if all hosts have failed and the running result is not ok 19665 1727204193.13902: done checking to see if all hosts have failed 19665 1727204193.13903: getting the remaining hosts for this loop 19665 1727204193.13904: done getting the remaining hosts for this loop 19665 1727204193.13907: getting the next task for host managed-node3 19665 1727204193.13912: done getting next task for host managed-node3 19665 1727204193.13914: ^ task is: TASK: meta (flush_handlers) 19665 1727204193.13916: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204193.13919: getting variables 19665 1727204193.13920: in VariableManager get_vars() 19665 1727204193.13942: Calling all_inventory to load vars for managed-node3 19665 1727204193.13944: Calling groups_inventory to load vars for managed-node3 19665 1727204193.13947: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.13955: done sending task result for task 0affcd87-79f5-0dcc-3ea6-0000000004fa 19665 1727204193.13958: WORKER PROCESS EXITING 19665 1727204193.13970: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.13973: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.13977: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.16520: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.18614: done with get_vars() 19665 1727204193.18650: done getting variables 19665 1727204193.18726: in VariableManager get_vars() 19665 1727204193.18737: Calling all_inventory to load vars for managed-node3 19665 1727204193.18740: Calling groups_inventory to load vars for managed-node3 19665 1727204193.18745: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.18751: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.18753: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.18762: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.20016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.21730: done with get_vars() 19665 1727204193.21771: done queuing things up, now waiting for results queue to drain 19665 1727204193.21773: results queue empty 19665 1727204193.21774: checking for any_errors_fatal 19665 1727204193.21779: done checking for any_errors_fatal 19665 1727204193.21780: checking for max_fail_percentage 19665 1727204193.21781: done checking for max_fail_percentage 19665 1727204193.21782: checking to see if all hosts have failed and the running result is not ok 19665 1727204193.21782: done checking to see if all hosts have failed 19665 1727204193.21783: getting the remaining hosts for this loop 19665 1727204193.21784: done getting the remaining hosts for this loop 19665 1727204193.21787: getting the next task for host managed-node3 19665 1727204193.21791: done getting next task for host managed-node3 19665 1727204193.21793: ^ task is: TASK: Verify network state restored to default 19665 1727204193.21795: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204193.21798: getting variables 19665 1727204193.21799: in VariableManager get_vars() 19665 1727204193.21810: Calling all_inventory to load vars for managed-node3 19665 1727204193.21812: Calling groups_inventory to load vars for managed-node3 19665 1727204193.21815: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.21820: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.21823: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.21826: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.23145: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.24771: done with get_vars() 19665 1727204193.24797: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:67 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.932) 0:00:44.115 ***** 19665 1727204193.24880: entering _queue_task() for managed-node3/include_tasks 19665 1727204193.25209: worker is 1 (out of 1 available) 19665 1727204193.25223: exiting _queue_task() for managed-node3/include_tasks 19665 1727204193.25235: done queuing things up, now waiting for results queue to drain 19665 1727204193.25236: waiting for pending results... 19665 1727204193.25519: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 19665 1727204193.25628: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000007a 19665 1727204193.25649: variable 'ansible_search_path' from source: unknown 19665 1727204193.25692: calling self._execute() 19665 1727204193.25794: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.25804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.25816: variable 'omit' from source: magic vars 19665 1727204193.26194: variable 'ansible_distribution_major_version' from source: facts 19665 1727204193.26211: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204193.26225: _execute() done 19665 1727204193.26232: dumping result to json 19665 1727204193.26239: done dumping result, returning 19665 1727204193.26251: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [0affcd87-79f5-0dcc-3ea6-00000000007a] 19665 1727204193.26262: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000007a 19665 1727204193.26388: no more pending results, returning what we have 19665 1727204193.26393: in VariableManager get_vars() 19665 1727204193.26428: Calling all_inventory to load vars for managed-node3 19665 1727204193.26431: Calling groups_inventory to load vars for managed-node3 19665 1727204193.26435: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.26451: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.26455: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.26459: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.27582: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000007a 19665 1727204193.27585: WORKER PROCESS EXITING 19665 1727204193.28170: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.29994: done with get_vars() 19665 1727204193.30015: 
variable 'ansible_search_path' from source: unknown 19665 1727204193.30032: we have included files to process 19665 1727204193.30034: generating all_blocks data 19665 1727204193.30035: done generating all_blocks data 19665 1727204193.30036: processing included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19665 1727204193.30037: loading included file: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19665 1727204193.30039: Loading data from /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 19665 1727204193.30480: done processing included file 19665 1727204193.30482: iterating over new_blocks loaded from include file 19665 1727204193.30484: in VariableManager get_vars() 19665 1727204193.30497: done with get_vars() 19665 1727204193.30498: filtering new block on tags 19665 1727204193.30516: done filtering new block on tags 19665 1727204193.30518: done iterating over new_blocks loaded from include file included: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 19665 1727204193.30524: extending task lists for all hosts with included blocks 19665 1727204193.30559: done extending task lists 19665 1727204193.30560: done processing included files 19665 1727204193.30561: results queue empty 19665 1727204193.30562: checking for any_errors_fatal 19665 1727204193.30565: done checking for any_errors_fatal 19665 1727204193.30566: checking for max_fail_percentage 19665 1727204193.30567: done checking for max_fail_percentage 19665 1727204193.30568: checking to see if all hosts have failed and the running result is not ok 19665 1727204193.30569: done checking to see if all hosts have failed 19665 1727204193.30570: getting the remaining hosts for this loop 19665 1727204193.30571: done getting the remaining hosts for this loop 19665 1727204193.30574: getting the next task for host managed-node3 19665 1727204193.30578: done getting next task for host managed-node3 19665 1727204193.30580: ^ task is: TASK: Check routes and DNS 19665 1727204193.30582: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204193.30585: getting variables 19665 1727204193.30586: in VariableManager get_vars() 19665 1727204193.30594: Calling all_inventory to load vars for managed-node3 19665 1727204193.30596: Calling groups_inventory to load vars for managed-node3 19665 1727204193.30598: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.30604: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.30606: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.30608: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.31861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.33515: done with get_vars() 19665 1727204193.33549: done getting variables 19665 1727204193.33598: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.087) 0:00:44.202 ***** 19665 1727204193.33632: entering _queue_task() for managed-node3/shell 19665 1727204193.33985: worker is 1 (out of 1 available) 19665 1727204193.33997: exiting _queue_task() for managed-node3/shell 19665 1727204193.34009: done queuing things up, now waiting for results queue to drain 19665 1727204193.34011: waiting for pending results... 19665 1727204193.34301: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 19665 1727204193.34431: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000050b 19665 1727204193.34458: variable 'ansible_search_path' from source: unknown 19665 1727204193.34470: variable 'ansible_search_path' from source: unknown 19665 1727204193.34513: calling self._execute() 19665 1727204193.34612: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.34624: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.34638: variable 'omit' from source: magic vars 19665 1727204193.35041: variable 'ansible_distribution_major_version' from source: facts 19665 1727204193.35063: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204193.35077: variable 'omit' from source: magic vars 19665 1727204193.35125: variable 'omit' from source: magic vars 19665 1727204193.35172: variable 'omit' from source: magic vars 19665 1727204193.35224: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204193.35271: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204193.35299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204193.35327: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204193.35346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204193.35382: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
19665 1727204193.35392: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.35400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.35504: Set connection var ansible_connection to ssh 19665 1727204193.35517: Set connection var ansible_shell_type to sh 19665 1727204193.35529: Set connection var ansible_timeout to 10 19665 1727204193.35548: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204193.35562: Set connection var ansible_shell_executable to /bin/sh 19665 1727204193.35578: Set connection var ansible_pipelining to False 19665 1727204193.35605: variable 'ansible_shell_executable' from source: unknown 19665 1727204193.35613: variable 'ansible_connection' from source: unknown 19665 1727204193.35622: variable 'ansible_module_compression' from source: unknown 19665 1727204193.35630: variable 'ansible_shell_type' from source: unknown 19665 1727204193.35637: variable 'ansible_shell_executable' from source: unknown 19665 1727204193.35651: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.35660: variable 'ansible_pipelining' from source: unknown 19665 1727204193.35669: variable 'ansible_timeout' from source: unknown 19665 1727204193.35677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.35828: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204193.35849: variable 'omit' from source: magic vars 19665 1727204193.35859: starting attempt loop 19665 1727204193.35872: running the handler 19665 1727204193.35887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204193.35910: _low_level_execute_command(): starting 19665 1727204193.35923: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204193.36730: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.36754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.36774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.36795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.36841: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.36859: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.36879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.36899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.36912: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.36924: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.36937: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 
1727204193.36956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.36979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.36993: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.37006: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.37022: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.37106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.37130: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.37150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.37233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.38884: stdout chunk (state=3): >>>/root <<< 19665 1727204193.38986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.39470: stderr chunk (state=3): >>><<< 19665 1727204193.39473: stdout chunk (state=3): >>><<< 19665 1727204193.39605: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.39609: _low_level_execute_command(): starting 19665 1727204193.39622: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473 `" && echo ansible-tmp-1727204193.3950102-23158-17579902831473="` echo /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473 `" ) && sleep 0' 19665 1727204193.40261: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.40279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.40293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.40310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.40358: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.40381: stderr chunk (state=3): >>>debug2: match not found 
<<< 19665 1727204193.40400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.40420: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.40436: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.40453: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.40472: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.40496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.40515: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.40528: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.40541: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.40558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.40646: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.40671: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.40687: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.40772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.42668: stdout chunk (state=3): >>>ansible-tmp-1727204193.3950102-23158-17579902831473=/root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473 <<< 19665 1727204193.42884: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.42888: stdout chunk (state=3): >>><<< 19665 1727204193.42891: stderr chunk (state=3): >>><<< 19665 1727204193.43216: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204193.3950102-23158-17579902831473=/root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.43220: variable 'ansible_module_compression' from source: unknown 19665 1727204193.43224: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19665 1727204193.43226: variable 'ansible_facts' from 
source: unknown 19665 1727204193.43229: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/AnsiballZ_command.py 19665 1727204193.43300: Sending initial data 19665 1727204193.43303: Sent initial data (155 bytes) 19665 1727204193.44275: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.44290: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.44305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.44322: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.44376: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.44390: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.44407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.44418: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.44426: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.44433: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.44441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.44454: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.44467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.44474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.44481: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.44490: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.44567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.44586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.44597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.44674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.46465: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204193.46504: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204193.46547: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpeom7kw2w /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/AnsiballZ_command.py <<< 19665 1727204193.46584: stderr chunk (state=3): >>>debug1: Couldn't stat remote 
file: No such file or directory <<< 19665 1727204193.47880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.48129: stderr chunk (state=3): >>><<< 19665 1727204193.48132: stdout chunk (state=3): >>><<< 19665 1727204193.48135: done transferring module to remote 19665 1727204193.48137: _low_level_execute_command(): starting 19665 1727204193.48139: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/ /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/AnsiballZ_command.py && sleep 0' 19665 1727204193.48773: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.48787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.48805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.48823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.48872: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.48885: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.48902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.48922: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.48933: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.48946: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.48958: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.48973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.48988: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.48999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.49009: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.49025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.49106: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.49125: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.49139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.49213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.51013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.51017: stdout chunk (state=3): >>><<< 19665 1727204193.51019: stderr chunk (state=3): >>><<< 19665 1727204193.51116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.51120: _low_level_execute_command(): starting 19665 1727204193.51123: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/AnsiballZ_command.py && sleep 0' 19665 1727204193.51697: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.51712: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.51727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.51745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.51793: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.51806: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.51821: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.51839: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.51853: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.51868: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.51882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.51897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.51913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.51926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.51940: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.51956: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.52032: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.52051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.52073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.52161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.66134: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n 
link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3074sec preferred_lft 3074sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:33.651624", "end": "2024-09-24 14:56:33.660016", "delta": "0:00:00.008392", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204193.67258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204193.67324: stderr chunk (state=3): >>><<< 19665 1727204193.67327: stdout chunk (state=3): >>><<< 19665 1727204193.67371: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0\n valid_lft 3074sec preferred_lft 3074sec\n inet6 fe80::8ff:f5ff:fed7:be93/64 scope link \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 \n10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 \nIP -6 ROUTE\n::1 dev lo proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 256 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:56:33.651624", "end": "2024-09-24 14:56:33.660016", "delta": "0:00:00.008392", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO 
/etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204193.67381: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204193.67390: _low_level_execute_command(): starting 19665 1727204193.67394: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204193.3950102-23158-17579902831473/ > /dev/null 2>&1 && sleep 0' 19665 1727204193.67875: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.67879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.67909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.67917: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.67931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.67945: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.67948: stderr chunk (state=3): 
>>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204193.67956: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.67961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.67970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.67979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.67984: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.68046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.68052: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.68063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.68121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.69872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.69928: stderr chunk (state=3): >>><<< 19665 1727204193.69932: stdout chunk (state=3): >>><<< 19665 1727204193.69948: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.69957: handler run complete 19665 1727204193.69975: Evaluated conditional (False): False 19665 1727204193.69984: attempt loop complete, returning result 19665 1727204193.69988: _execute() done 19665 1727204193.69991: dumping result to json 19665 1727204193.69999: done dumping result, returning 19665 1727204193.70008: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [0affcd87-79f5-0dcc-3ea6-00000000050b] 19665 1727204193.70013: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000050b 19665 1727204193.70113: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000050b 19665 1727204193.70116: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008392", 
"end": "2024-09-24 14:56:33.660016", "rc": 0, "start": "2024-09-24 14:56:33.651624" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 0a:ff:f5:d7:be:93 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.15.87/22 brd 10.31.15.255 scope global dynamic noprefixroute eth0 valid_lft 3074sec preferred_lft 3074sec inet6 fe80::8ff:f5ff:fed7:be93/64 scope link valid_lft forever preferred_lft forever IP ROUTE default via 10.31.12.1 dev eth0 proto dhcp src 10.31.15.87 metric 100 10.31.12.0/22 dev eth0 proto kernel scope link src 10.31.15.87 metric 100 IP -6 ROUTE ::1 dev lo proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 256 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 10.2.32.1 19665 1727204193.70187: no more pending results, returning what we have 19665 1727204193.70191: results queue empty 19665 1727204193.70192: checking for any_errors_fatal 19665 1727204193.70194: done checking for any_errors_fatal 19665 1727204193.70194: checking for max_fail_percentage 19665 1727204193.70196: done checking for max_fail_percentage 19665 1727204193.70197: checking to see if all hosts have failed and the running result is not ok 19665 1727204193.70198: done checking to see if all hosts have failed 19665 1727204193.70198: getting the remaining hosts for this loop 19665 1727204193.70200: done getting the remaining hosts for this loop 19665 1727204193.70205: getting the next task for host managed-node3 19665 1727204193.70210: done getting next task for host managed-node3 19665 1727204193.70213: ^ task is: TASK: Verify DNS and network connectivity 19665 1727204193.70217: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204193.70221: getting variables 19665 1727204193.70222: in VariableManager get_vars() 19665 1727204193.70253: Calling all_inventory to load vars for managed-node3 19665 1727204193.70255: Calling groups_inventory to load vars for managed-node3 19665 1727204193.70259: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204193.70271: Calling all_plugins_play to load vars for managed-node3 19665 1727204193.70274: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204193.70277: Calling groups_plugins_play to load vars for managed-node3 19665 1727204193.72318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204193.74741: done with get_vars() 19665 1727204193.74775: done getting variables 19665 1727204193.74825: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:56:33 -0400 (0:00:00.412) 0:00:44.615 ***** 19665 1727204193.74854: entering _queue_task() for managed-node3/shell 19665 1727204193.75100: worker is 1 (out of 1 available) 19665 1727204193.75116: exiting _queue_task() for managed-node3/shell 19665 1727204193.75127: done queuing things up, now waiting for results queue to drain 19665 1727204193.75129: waiting for pending results... 19665 1727204193.75305: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 19665 1727204193.75379: in run() - task 0affcd87-79f5-0dcc-3ea6-00000000050c 19665 1727204193.75390: variable 'ansible_search_path' from source: unknown 19665 1727204193.75393: variable 'ansible_search_path' from source: unknown 19665 1727204193.75421: calling self._execute() 19665 1727204193.75497: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.75500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.75510: variable 'omit' from source: magic vars 19665 1727204193.75795: variable 'ansible_distribution_major_version' from source: facts 19665 1727204193.75806: Evaluated conditional (ansible_distribution_major_version != '6'): True 19665 1727204193.75905: variable 'ansible_facts' from source: unknown 19665 1727204193.76399: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 19665 1727204193.76404: variable 'omit' from source: magic vars 19665 1727204193.76479: variable 'omit' from source: magic vars 19665 1727204193.76522: variable 'omit' from source: magic vars 19665 1727204193.76576: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 19665 1727204193.76627: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 19665 1727204193.76647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 19665 1727204193.77272: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204193.77275: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 19665 1727204193.77277: variable 'inventory_hostname' from source: host vars for 'managed-node3' 19665 1727204193.77279: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.77281: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.77283: Set connection var ansible_connection to ssh 19665 1727204193.77285: Set connection var ansible_shell_type to sh 19665 1727204193.77287: Set connection var ansible_timeout to 10 19665 1727204193.77289: Set connection var ansible_module_compression to ZIP_DEFLATED 19665 1727204193.77291: Set connection var ansible_shell_executable to /bin/sh 19665 1727204193.77293: Set connection var ansible_pipelining to False 19665 1727204193.77295: variable 'ansible_shell_executable' from source: unknown 19665 1727204193.77297: variable 'ansible_connection' from source: unknown 19665 1727204193.77298: variable 'ansible_module_compression' from source: unknown 19665 1727204193.77301: variable 'ansible_shell_type' from source: unknown 19665 1727204193.77303: variable 'ansible_shell_executable' from source: unknown 19665 1727204193.77305: variable 'ansible_host' from source: host vars for 'managed-node3' 19665 1727204193.77307: variable 'ansible_pipelining' from source: unknown 19665 1727204193.77309: variable 'ansible_timeout' from source: unknown 19665 1727204193.77311: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 19665 1727204193.77314: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204193.77316: variable 'omit' from source: magic vars 19665 1727204193.77318: starting attempt loop 19665 1727204193.77320: running the handler 19665 1727204193.77322: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 19665 1727204193.77324: _low_level_execute_command(): starting 19665 1727204193.77326: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 19665 1727204193.78013: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.78029: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.78044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.78068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.78124: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.78136: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204193.78149: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.78170: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204193.78186: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 
1727204193.78196: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204193.78220: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.78235: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.78251: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.78263: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.78276: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.78293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.78385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.78411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.78432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.78505: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.80043: stdout chunk (state=3): >>>/root <<< 19665 1727204193.80180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.80188: stderr chunk (state=3): >>><<< 19665 1727204193.80191: stdout chunk (state=3): >>><<< 19665 1727204193.80212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.80225: _low_level_execute_command(): starting 19665 1727204193.80231: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349 `" && echo ansible-tmp-1727204193.8021197-23181-20409309183349="` echo /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349 `" ) && sleep 0' 19665 1727204193.80699: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.80702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.80740: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 
10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.80744: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.80746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.80801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.80805: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.80852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.82665: stdout chunk (state=3): >>>ansible-tmp-1727204193.8021197-23181-20409309183349=/root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349 <<< 19665 1727204193.82779: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.82829: stderr chunk (state=3): >>><<< 19665 1727204193.82833: stdout chunk (state=3): >>><<< 19665 1727204193.82852: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204193.8021197-23181-20409309183349=/root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349 , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.82881: variable 'ansible_module_compression' from source: unknown 19665 1727204193.82927: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-196652yv5_2fn/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 19665 1727204193.82958: variable 'ansible_facts' from source: unknown 19665 1727204193.83008: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/AnsiballZ_command.py 19665 1727204193.83115: Sending initial data 19665 1727204193.83125: Sent initial data (155 bytes) 19665 1727204193.83794: stderr chunk (state=3): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.83797: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.83832: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found <<< 19665 1727204193.83836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.83838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.83896: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.83899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.83944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.85639: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 19665 1727204193.85698: stderr chunk (state=3): >>>debug1: Using server download size 261120 debug1: Using server upload size 261120 debug1: Server handle limit 1019; using 64 <<< 19665 1727204193.85714: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-196652yv5_2fn/tmpd1h9ajo4 /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/AnsiballZ_command.py <<< 19665 1727204193.85779: stderr chunk (state=3): >>>debug1: Couldn't stat remote file: No such file or directory <<< 19665 1727204193.87099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.87304: stderr chunk (state=3): >>><<< 19665 1727204193.87322: stdout chunk (state=3): >>><<< 19665 1727204193.87458: done transferring module to remote 19665 1727204193.87462: _low_level_execute_command(): starting 19665 1727204193.87464: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/ /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/AnsiballZ_command.py && sleep 0' 19665 1727204193.88097: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204193.88101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.88147: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.88157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.88160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204193.88186: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204193.88207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.88284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.88302: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204193.88316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.88381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204193.90068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204193.90123: stderr chunk (state=3): >>><<< 19665 1727204193.90127: stdout chunk (state=3): >>><<< 19665 1727204193.90144: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204193.90149: _low_level_execute_command(): starting 19665 1727204193.90152: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.9 /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/AnsiballZ_command.py && sleep 0' 19665 1727204193.90641: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204193.90645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204193.90687: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.90691: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204193.90694: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204193.90744: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204193.90748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204193.90803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204194.53691: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1355 0 --:--:-- --:--:-- --:--:-- 1355\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:56:34.037090", "end": "2024-09-24 14:56:34.535483", "delta": "0:00:00.498393", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 19665 1727204194.55036: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. <<< 19665 1727204194.55071: stderr chunk (state=3): >>><<< 19665 1727204194.55075: stdout chunk (state=3): >>><<< 19665 1727204194.55228: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 1355 0 --:--:-- --:--:-- --:--:-- 1355\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-24 14:56:34.037090", "end": "2024-09-24 14:56:34.535483", "delta": "0:00:00.498393", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.15.87 closed. 19665 1727204194.55238: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! 
curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 19665 1727204194.55240: _low_level_execute_command(): starting 19665 1727204194.55242: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204193.8021197-23181-20409309183349/ > /dev/null 2>&1 && sleep 0' 19665 1727204194.56102: stderr chunk (state=2): >>>OpenSSH_8.7p1, OpenSSL 3.2.2 4 Jun 2024 <<< 19665 1727204194.56118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204194.56136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204194.56154: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204194.56199: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204194.56213: stderr chunk (state=3): >>>debug2: match not found <<< 19665 1727204194.56230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204194.56252: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 19665 1727204194.56267: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.15.87 is address <<< 19665 1727204194.56294: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 19665 1727204194.56309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 19665 1727204194.56324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 19665 1727204194.56342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 19665 1727204194.56359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 <<< 19665 1727204194.56374: stderr chunk (state=3): >>>debug2: match found <<< 19665 1727204194.56389: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 19665 1727204194.56471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 19665 1727204194.56495: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 19665 1727204194.56513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 19665 1727204194.56592: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 19665 1727204194.58465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 19665 1727204194.58469: stdout chunk (state=3): >>><<< 19665 1727204194.58472: stderr chunk (state=3): >>><<< 19665 1727204194.58569: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_8.7p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.15.87 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.15.87 originally 10.31.15.87 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 19665 1727204194.58573: handler run complete 19665 1727204194.58576: Evaluated conditional (False): False 19665 1727204194.58578: attempt loop complete, returning result 19665 1727204194.58581: _execute() done 19665 1727204194.58584: dumping result to json 19665 1727204194.58586: done dumping result, returning 19665 1727204194.58589: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [0affcd87-79f5-0dcc-3ea6-00000000050c] 19665 1727204194.58591: sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000050c 19665 1727204194.58851: done sending task result for task 0affcd87-79f5-0dcc-3ea6-00000000050c 19665 1727204194.58854: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.498393", "end": "2024-09-24 14:56:34.535483", "rc": 0, "start": "2024-09-24 14:56:34.037090" } STDOUT: CHECK DNS AND CONNECTIVITY 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 1355 0 --:--:-- --:--:-- --:--:-- 1355 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 1310 0 --:--:-- --:--:-- --:--:-- 1316 19665 1727204194.58956: no more pending results, returning what we have 19665 1727204194.58969: results queue empty 19665 1727204194.58971: checking for any_errors_fatal 19665 1727204194.58981: done checking for any_errors_fatal 19665 1727204194.58982: checking for max_fail_percentage 19665 1727204194.58986: done checking for max_fail_percentage 19665 1727204194.58986: checking to see if all hosts have failed and the running result is not ok 19665 1727204194.58987: done checking to see if all hosts have failed 19665 1727204194.58988: getting the remaining hosts for this loop 19665 1727204194.58990: done getting the remaining hosts for this loop 19665 1727204194.58994: getting the next task for host managed-node3 19665 1727204194.59003: done getting next task for host managed-node3 19665 1727204194.59005: ^ task is: TASK: meta (flush_handlers) 19665 1727204194.59007: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204194.59012: getting variables 19665 1727204194.59014: in VariableManager get_vars() 19665 1727204194.59045: Calling all_inventory to load vars for managed-node3 19665 1727204194.59047: Calling groups_inventory to load vars for managed-node3 19665 1727204194.59051: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204194.59063: Calling all_plugins_play to load vars for managed-node3 19665 1727204194.59068: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204194.59071: Calling groups_plugins_play to load vars for managed-node3 19665 1727204194.61432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204194.63424: done with get_vars() 19665 1727204194.63460: done getting variables 19665 1727204194.63545: in VariableManager get_vars() 19665 1727204194.63557: Calling all_inventory to load vars for managed-node3 19665 1727204194.63559: Calling groups_inventory to load vars for managed-node3 19665 1727204194.63562: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204194.63569: Calling all_plugins_play to load vars for managed-node3 19665 1727204194.63571: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204194.63574: Calling groups_plugins_play to load vars for managed-node3 19665 1727204194.65156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204194.67071: done with get_vars() 19665 1727204194.67121: done queuing things up, now waiting for results queue to drain 19665 1727204194.67124: results queue empty 19665 1727204194.67125: checking for any_errors_fatal 19665 1727204194.67131: done checking for any_errors_fatal 19665 1727204194.67132: checking for max_fail_percentage 19665 1727204194.67133: done checking for max_fail_percentage 19665 1727204194.67134: checking to see if all hosts have failed and the running result is not ok 19665 1727204194.67134: done checking to see if all hosts have failed 19665 1727204194.67135: getting the remaining hosts for this loop 19665 1727204194.67136: done getting the remaining hosts for this loop 19665 1727204194.67139: getting the next task for host managed-node3 19665 1727204194.67146: done getting next task for host managed-node3 19665 1727204194.67147: ^ task is: TASK: meta (flush_handlers) 19665 1727204194.67149: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 19665 1727204194.67151: getting variables 19665 1727204194.67152: in VariableManager get_vars() 19665 1727204194.67163: Calling all_inventory to load vars for managed-node3 19665 1727204194.67168: Calling groups_inventory to load vars for managed-node3 19665 1727204194.67171: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204194.67184: Calling all_plugins_play to load vars for managed-node3 19665 1727204194.67191: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204194.67198: Calling groups_plugins_play to load vars for managed-node3 19665 1727204194.69513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204194.71566: done with get_vars() 19665 1727204194.71590: done getting variables 19665 1727204194.71654: in VariableManager get_vars() 19665 1727204194.71666: Calling all_inventory to load vars for managed-node3 19665 1727204194.71669: Calling groups_inventory to load vars for managed-node3 19665 1727204194.71671: Calling all_plugins_inventory to load vars for managed-node3 19665 1727204194.71676: Calling all_plugins_play to load vars for managed-node3 19665 1727204194.71678: Calling groups_plugins_inventory to load vars for managed-node3 19665 1727204194.71681: Calling groups_plugins_play to load vars for managed-node3 19665 1727204194.85071: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 19665 1727204194.87259: done with get_vars() 19665 1727204194.87394: done queuing things up, now waiting for results queue to drain 19665 1727204194.87397: results queue empty 19665 1727204194.87398: checking for any_errors_fatal 19665 1727204194.87399: done checking for any_errors_fatal 19665 1727204194.87400: checking for max_fail_percentage 19665 1727204194.87401: done checking for max_fail_percentage 19665 1727204194.87402: checking to see if all hosts have failed and the running result is not ok 19665 1727204194.87403: done checking to see if all hosts have failed 19665 1727204194.87403: getting the remaining hosts for this loop 19665 1727204194.87404: done getting the remaining hosts for this loop 19665 1727204194.87407: getting the next task for host managed-node3 19665 1727204194.87411: done getting next task for host managed-node3 19665 1727204194.87411: ^ task is: None 19665 1727204194.87413: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 
19665 1727204194.87414: done queuing things up, now waiting for results queue to drain 
19665 1727204194.87415: results queue empty 
19665 1727204194.87416: checking for any_errors_fatal 
19665 1727204194.87416: done checking for any_errors_fatal 
19665 1727204194.87424: checking for max_fail_percentage 
19665 1727204194.87426: done checking for max_fail_percentage 
19665 1727204194.87445: checking to see if all hosts have failed and the running result is not ok 
19665 1727204194.87447: done checking to see if all hosts have failed 
19665 1727204194.87449: getting the next task for host managed-node3 
19665 1727204194.87452: done getting next task for host managed-node3 
19665 1727204194.87453: ^ task is: None 
19665 1727204194.87455: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 

PLAY RECAP *********************************************************************
managed-node3 : ok=82 changed=3 unreachable=0 failed=0 skipped=71 rescued=0 ignored=2 

Tuesday 24 September 2024 14:56:34 -0400 (0:00:01.127) 0:00:45.742 ***** 
=============================================================================== 
fedora.linux_system_roles.network : Check which services are running ---- 1.79s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.72s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which services are running ---- 1.67s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 
fedora.linux_system_roles.network : Check which packages are installed --- 1.47s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Gathering Facts --------------------------------------------------------- 1.33s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/tests_bridge_nm.yml:6 
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.28s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.21s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 
fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.20s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 
Gathering Facts --------------------------------------------------------- 1.15s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 
Gathering Facts --------------------------------------------------------- 1.14s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 
Verify DNS and network connectivity ------------------------------------- 1.13s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 
Gathering Facts --------------------------------------------------------- 1.02s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:3 
fedora.linux_system_roles.network : Check which packages are installed --- 1.00s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Gathering Facts --------------------------------------------------------- 0.98s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 
Gathering Facts --------------------------------------------------------- 0.96s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 
Gathering Facts --------------------------------------------------------- 0.93s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:64 
fedora.linux_system_roles.network : Check which packages are installed --- 0.86s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 
Gathering Facts --------------------------------------------------------- 0.85s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bridge.yml:17 
Gathering Facts --------------------------------------------------------- 0.83s
/tmp/collections-G1p/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/run_tasks.yml:3 
19665 1727204194.87682: RUNNING CLEANUP
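
Note: the escaped _raw_params string passed to ansible.legacy.command by the "Verify DNS and network connectivity" task (check_network_dns.yml:24, logged above) unescapes to the shell script below. The indentation and the "#" comments are annotations added here for readability and are not part of the logged command; everything else is reproduced verbatim from the log.

    set -euo pipefail
    echo CHECK DNS AND CONNECTIVITY
    for host in mirrors.fedoraproject.org mirrors.centos.org; do
        # DNS check: getent exits non-zero if the mirror hostname does not resolve
        if ! getent hosts "$host"; then
            echo FAILED to lookup host "$host"
            exit 1
        fi
        # Connectivity check: curl must complete an HTTPS request to the mirror
        if ! curl -o /dev/null https://"$host"; then
            echo FAILED to contact host "$host"
            exit 1
        fi
    done

In the run recorded above both checks pass: getent returns the wildcard.fedoraproject.org AAAA records for each mirror, the curl progress meters in STDERR show 305 and 291 bytes downloaded, and the task finishes with rc=0 and changed=false.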