10215 1727204030.77110: starting run ansible-playbook [core 2.17.4] config file = None configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules'] ansible python module location = /usr/local/lib/python3.12/site-packages/ansible ansible collection location = /tmp/collections-twx executable location = /usr/local/bin/ansible-playbook python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12) jinja version = 3.1.4 libyaml = True No config file found; using defaults 10215 1727204030.77684: Added group all to inventory 10215 1727204030.77686: Added group ungrouped to inventory 10215 1727204030.77694: Group all now contains ungrouped 10215 1727204030.77698: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml 10215 1727204031.00174: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache 10215 1727204031.00255: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py 10215 1727204031.00293: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory 10215 1727204031.00377: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py 10215 1727204031.00479: Loaded config def from plugin (inventory/script) 10215 1727204031.00482: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py 10215 1727204031.00536: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py 10215 1727204031.00659: Loaded config def from plugin (inventory/yaml) 10215 1727204031.00662: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py 10215 1727204031.00831: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py 10215 1727204031.01510: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py 10215 1727204031.01514: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py) 10215 1727204031.01518: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py) 10215 1727204031.01525: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py) 10215 1727204031.01698: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 10215 1727204031.01813: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto 10215 1727204031.01976: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py) 10215 1727204031.02036: Loading data from /tmp/network-6Zh/inventory-Sfc.yml 10215 1727204031.02200: group all already in inventory 10215 1727204031.02209: set inventory_file for managed-node1 10215 1727204031.02214: set inventory_dir for managed-node1 10215 1727204031.02215: Added host managed-node1 to inventory 10215 1727204031.02218: Added host managed-node1 to group all 10215 1727204031.02219: set ansible_host for managed-node1 10215 1727204031.02221: set ansible_ssh_extra_args for managed-node1 10215 1727204031.02225: set inventory_file for managed-node2 10215 1727204031.02234: set inventory_dir for managed-node2 10215 1727204031.02235: Added host managed-node2 to inventory 10215 1727204031.02237: Added host managed-node2 to group 
all 10215 1727204031.02238: set ansible_host for managed-node2 10215 1727204031.02239: set ansible_ssh_extra_args for managed-node2 10215 1727204031.02243: set inventory_file for managed-node3 10215 1727204031.02247: set inventory_dir for managed-node3 10215 1727204031.02248: Added host managed-node3 to inventory 10215 1727204031.02250: Added host managed-node3 to group all 10215 1727204031.02251: set ansible_host for managed-node3 10215 1727204031.02252: set ansible_ssh_extra_args for managed-node3 10215 1727204031.02255: Reconcile groups and hosts in inventory. 10215 1727204031.02260: Group ungrouped now contains managed-node1 10215 1727204031.02263: Group ungrouped now contains managed-node2 10215 1727204031.02265: Group ungrouped now contains managed-node3 10215 1727204031.02369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 10215 1727204031.02601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 10215 1727204031.02780: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 10215 1727204031.02822: Loaded config def from plugin (vars/host_group_vars) 10215 1727204031.02825: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 10215 1727204031.02833: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 10215 1727204031.02844: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 10215 1727204031.03031: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 10215 1727204031.03779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204031.03908: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 10215 1727204031.03970: Loaded config def from plugin (connection/local) 10215 1727204031.03974: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 10215 1727204031.05373: Loaded config def from plugin (connection/paramiko_ssh) 10215 1727204031.05378: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 10215 1727204031.06997: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10215 1727204031.07055: Loaded config def from plugin (connection/psrp) 10215 1727204031.07059: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 10215 1727204031.08453: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10215 1727204031.08516: Loaded config def from plugin (connection/ssh) 10215 1727204031.08520: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 10215 1727204031.12150: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 10215 1727204031.12214: Loaded config def from plugin (connection/winrm) 10215 1727204031.12218: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 10215 1727204031.12317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 10215 1727204031.12410: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 10215 1727204031.12517: Loaded config def from plugin (shell/cmd) 10215 1727204031.12520: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 10215 1727204031.12553: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 10215 1727204031.12657: Loaded config def from plugin (shell/powershell) 10215 1727204031.12660: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 10215 1727204031.12740: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 10215 1727204031.13247: Loaded config def from plugin (shell/sh) 10215 1727204031.13250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 10215 1727204031.13296: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 10215 1727204031.13487: Loaded config def from plugin (become/runas) 10215 1727204031.13571: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 10215 1727204031.13962: Loaded config def from plugin (become/su) 10215 1727204031.13965: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 10215 1727204031.14360: Loaded config def from plugin (become/sudo) 10215 1727204031.14363: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 10215 1727204031.14497: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 10215 1727204031.15071: in VariableManager get_vars() 10215 1727204031.15099: done with get_vars() 10215 1727204031.15351: trying /usr/local/lib/python3.12/site-packages/ansible/modules 10215 1727204031.20029: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 10215 1727204031.20200: in VariableManager get_vars() 10215 1727204031.20207: done with get_vars() 10215 1727204031.20210: variable 'playbook_dir' from source: magic vars 10215 1727204031.20211: variable 'ansible_playbook_python' from source: magic vars 10215 1727204031.20212: variable 'ansible_config_file' from source: 
magic vars 10215 1727204031.20213: variable 'groups' from source: magic vars 10215 1727204031.20214: variable 'omit' from source: magic vars 10215 1727204031.20215: variable 'ansible_version' from source: magic vars 10215 1727204031.20216: variable 'ansible_check_mode' from source: magic vars 10215 1727204031.20217: variable 'ansible_diff_mode' from source: magic vars 10215 1727204031.20217: variable 'ansible_forks' from source: magic vars 10215 1727204031.20218: variable 'ansible_inventory_sources' from source: magic vars 10215 1727204031.20219: variable 'ansible_skip_tags' from source: magic vars 10215 1727204031.20220: variable 'ansible_limit' from source: magic vars 10215 1727204031.20221: variable 'ansible_run_tags' from source: magic vars 10215 1727204031.20222: variable 'ansible_verbosity' from source: magic vars 10215 1727204031.20296: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml 10215 1727204031.22183: in VariableManager get_vars() 10215 1727204031.22239: done with get_vars() 10215 1727204031.22258: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml 10215 1727204031.24001: in VariableManager get_vars() 10215 1727204031.24021: done with get_vars() 10215 1727204031.24033: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10215 1727204031.24180: in VariableManager get_vars() 10215 1727204031.24209: done with get_vars() 10215 1727204031.24402: in VariableManager get_vars() 10215 1727204031.24425: done with get_vars() 10215 1727204031.24437: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10215 1727204031.24541: in VariableManager get_vars() 10215 1727204031.24560: done with get_vars() 10215 1727204031.25251: in VariableManager get_vars() 10215 1727204031.25266: done with get_vars() 10215 1727204031.25271: variable 'omit' from source: magic vars 10215 1727204031.25397: variable 'omit' from source: magic vars 10215 1727204031.25450: in VariableManager get_vars() 10215 1727204031.25465: done with get_vars() 10215 1727204031.25638: in VariableManager get_vars() 10215 1727204031.25653: done with get_vars() 10215 1727204031.25738: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10215 1727204031.26126: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10215 1727204031.26315: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10215 1727204031.27261: in VariableManager get_vars() 10215 1727204031.27295: done with get_vars() 10215 1727204031.27882: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 10215 1727204031.28080: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10215 1727204031.30267: in VariableManager get_vars() 10215 1727204031.30293: done with get_vars() 10215 1727204031.30304: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 10215 1727204031.30527: in VariableManager get_vars() 10215 1727204031.30569: done with get_vars() 10215 1727204031.30730: in VariableManager get_vars() 10215 1727204031.30752: done with get_vars() 10215 1727204031.31151: in VariableManager get_vars() 10215 1727204031.31173: done with get_vars() 10215 1727204031.31179: variable 'omit' from source: magic vars 10215 1727204031.31218: variable 'omit' from source: magic vars 10215 1727204031.31272: in VariableManager get_vars() 10215 1727204031.31291: done with get_vars() 10215 1727204031.31324: in VariableManager get_vars() 10215 1727204031.31344: done with get_vars() 10215 1727204031.31381: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 10215 1727204031.31549: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 10215 1727204031.35254: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 10215 1727204031.36334: in VariableManager get_vars() 10215 1727204031.36366: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10215 1727204031.39124: in VariableManager get_vars() 10215 1727204031.39157: done with get_vars() 10215 1727204031.39169: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml 10215 1727204031.39843: in VariableManager get_vars() 10215 1727204031.39869: done with get_vars() 10215 1727204031.39950: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 10215 1727204031.39969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 10215 1727204031.40262: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 10215 1727204031.40702: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 10215 1727204031.40705: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) 10215 1727204031.40746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 10215 1727204031.40779: Loading ModuleDocFragment 'default_callback' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 10215 1727204031.41411: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 10215 1727204031.41483: Loaded config def from plugin (callback/default) 10215 1727204031.41486: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10215 1727204031.42913: Loaded config def from plugin (callback/junit) 10215 1727204031.42916: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10215 1727204031.42973: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 10215 1727204031.43069: Loaded config def from plugin (callback/minimal) 10215 1727204031.43072: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10215 1727204031.43126: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) 10215 1727204031.43231: Loaded config def from plugin (callback/tree) 10215 1727204031.43234: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks 10215 1727204031.43404: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks) 10215 1727204031.43409: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True) Skipping callback 'default', as we already have a stdout callback. Skipping callback 'minimal', as we already have a stdout callback. Skipping callback 'oneline', as we already have a stdout callback. 
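Before the play output begins, it may help to picture the inventory that the yaml plugin parsed at the top of this log. The entries above (managed-node1/2/3 added to group all, with ansible_host and ansible_ssh_extra_args set per host, and all three ending up in ungrouped) imply an inventory file shaped roughly like the sketch below. This is a reconstruction for orientation only: /tmp/network-6Zh/inventory-Sfc.yml itself is not shown in the log, and the addresses and SSH options below are placeholders, not recovered values.

# Hypothetical reconstruction of /tmp/network-6Zh/inventory-Sfc.yml (structure inferred
# from the inventory entries above; addresses and SSH arguments are placeholders)
all:
  hosts:
    managed-node1:
      ansible_host: 203.0.113.11                                 # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder option
    managed-node2:
      ansible_host: 203.0.113.12                                 # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder option
    managed-node3:
      ansible_host: 203.0.113.13                                 # placeholder address
      ansible_ssh_extra_args: "-o UserKnownHostsFile=/dev/null"  # placeholder option

Hosts declared directly under all.hosts and not assigned to any other group are what the log reports as members of ungrouped.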
PLAYBOOK: tests_bond_nm.yml **************************************************** 2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml 10215 1727204031.43440: in VariableManager get_vars() 10215 1727204031.43454: done with get_vars() 10215 1727204031.43459: in VariableManager get_vars() 10215 1727204031.43468: done with get_vars() 10215 1727204031.43473: variable 'omit' from source: magic vars 10215 1727204031.43522: in VariableManager get_vars() 10215 1727204031.43541: done with get_vars() 10215 1727204031.43565: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] ************* 10215 1727204031.44845: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 10215 1727204031.44936: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 10215 1727204031.45270: getting the remaining hosts for this loop 10215 1727204031.45272: done getting the remaining hosts for this loop 10215 1727204031.45276: getting the next task for host managed-node3 10215 1727204031.45280: done getting next task for host managed-node3 10215 1727204031.45282: ^ task is: TASK: Gathering Facts 10215 1727204031.45284: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204031.45287: getting variables 10215 1727204031.45292: in VariableManager get_vars() 10215 1727204031.45304: Calling all_inventory to load vars for managed-node3 10215 1727204031.45307: Calling groups_inventory to load vars for managed-node3 10215 1727204031.45311: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204031.45325: Calling all_plugins_play to load vars for managed-node3 10215 1727204031.45339: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204031.45344: Calling groups_plugins_play to load vars for managed-node3 10215 1727204031.45387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204031.45455: done with get_vars() 10215 1727204031.45464: done getting variables 10215 1727204031.45754: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.024) 0:00:00.024 ***** 10215 1727204031.45779: entering _queue_task() for managed-node3/gather_facts 10215 1727204031.45781: Creating lock for gather_facts 10215 1727204031.46463: worker is 1 (out of 1 available) 10215 1727204031.46476: exiting _queue_task() for managed-node3/gather_facts 10215 1727204031.46493: done queuing things up, now waiting for results queue to drain 10215 1727204031.46497: waiting for pending results... 
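For context on what is being executed: the banner above reports two plays in tests_bond_nm.yml, the first named "Run playbook 'playbooks/tests_bond.yml' with nm as provider", and the loader pulled in playbooks/tests_bond.yml right after the wrapper file. That is consistent with a thin wrapper playbook along the lines of the sketch below. Only the play name and the imported file path come from the log; the provider-selection task, including the variable name network_provider, is an assumption.

# Hypothetical sketch of tests_bond_nm.yml (wrapper shape inferred from the log;
# the set_fact task and the variable name are assumptions, not recovered content)
- name: Run playbook 'playbooks/tests_bond.yml' with nm as provider
  hosts: all
  tasks:
    - name: Set provider to NetworkManager        # assumed task
      set_fact:
        network_provider: nm                      # assumed variable and value

- import_playbook: playbooks/tests_bond.yml       # path taken from the log

Under this shape, the TASK [Gathering Facts] above is the fact-gathering step of that first play, which is what the worker now runs against managed-node3.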
10215 1727204031.47045: running TaskExecutor() for managed-node3/TASK: Gathering Facts 10215 1727204031.47183: in run() - task 12b410aa-8751-3c74-8f8e-0000000000cc 10215 1727204031.47188: variable 'ansible_search_path' from source: unknown 10215 1727204031.47194: calling self._execute() 10215 1727204031.47466: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204031.47470: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204031.47481: variable 'omit' from source: magic vars 10215 1727204031.47750: variable 'omit' from source: magic vars 10215 1727204031.47838: variable 'omit' from source: magic vars 10215 1727204031.47842: variable 'omit' from source: magic vars 10215 1727204031.48013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204031.48056: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204031.48190: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204031.48213: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204031.48227: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204031.48263: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204031.48268: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204031.48271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204031.48694: Set connection var ansible_connection to ssh 10215 1727204031.48698: Set connection var ansible_pipelining to False 10215 1727204031.48701: Set connection var ansible_shell_type to sh 10215 1727204031.48704: Set connection var ansible_timeout to 10 10215 1727204031.48706: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204031.48719: Set connection var ansible_shell_executable to /bin/sh 10215 1727204031.48721: variable 'ansible_shell_executable' from source: unknown 10215 1727204031.48723: variable 'ansible_connection' from source: unknown 10215 1727204031.48726: variable 'ansible_module_compression' from source: unknown 10215 1727204031.48728: variable 'ansible_shell_type' from source: unknown 10215 1727204031.48731: variable 'ansible_shell_executable' from source: unknown 10215 1727204031.48734: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204031.48736: variable 'ansible_pipelining' from source: unknown 10215 1727204031.48738: variable 'ansible_timeout' from source: unknown 10215 1727204031.48741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204031.49263: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204031.49267: variable 'omit' from source: magic vars 10215 1727204031.49274: starting attempt loop 10215 1727204031.49277: running the handler 10215 1727204031.49401: variable 'ansible_facts' from source: unknown 10215 1727204031.49480: _low_level_execute_command(): starting 10215 1727204031.49483: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204031.50950: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204031.50973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204031.50987: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204031.51341: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204031.51346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204031.51512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204031.53272: stdout chunk (state=3): >>>/root <<< 10215 1727204031.53372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204031.53437: stderr chunk (state=3): >>><<< 10215 1727204031.53447: stdout chunk (state=3): >>><<< 10215 1727204031.53471: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204031.53509: _low_level_execute_command(): starting 10215 1727204031.53684: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616 `" && echo ansible-tmp-1727204031.5349262-10346-34182444860616="` echo /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616 `" ) && sleep 0' 10215 1727204031.55005: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204031.55135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204031.55221: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204031.55360: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204031.57382: stdout chunk (state=3): >>>ansible-tmp-1727204031.5349262-10346-34182444860616=/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616 <<< 10215 1727204031.57881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204031.57885: stdout chunk (state=3): >>><<< 10215 1727204031.57888: stderr chunk (state=3): >>><<< 10215 1727204031.57893: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204031.5349262-10346-34182444860616=/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204031.57896: variable 'ansible_module_compression' from source: unknown 10215 1727204031.57898: ANSIBALLZ: Using generic lock for ansible.legacy.setup 10215 1727204031.57900: ANSIBALLZ: Acquiring lock 10215 1727204031.57902: ANSIBALLZ: Lock acquired: 139878728192448 10215 1727204031.57904: ANSIBALLZ: Creating module 10215 1727204032.03529: ANSIBALLZ: Writing module into payload 10215 1727204032.04027: ANSIBALLZ: Writing module 10215 1727204032.04117: ANSIBALLZ: Renaming module 10215 1727204032.04130: ANSIBALLZ: Done creating module 10215 1727204032.04182: variable 'ansible_facts' from source: unknown 10215 1727204032.04200: 
variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204032.04218: _low_level_execute_command(): starting 10215 1727204032.04229: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 10215 1727204032.05210: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204032.05266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204032.05346: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204032.05414: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204032.05438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204032.05507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204032.07257: stdout chunk (state=3): >>>PLATFORM <<< 10215 1727204032.07347: stdout chunk (state=3): >>>Linux <<< 10215 1727204032.07354: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<< 10215 1727204032.07405: stdout chunk (state=3): >>>/usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 10215 1727204032.07648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204032.07695: stderr chunk (state=3): >>><<< 10215 1727204032.07712: stdout chunk (state=3): >>><<< 10215 1727204032.07809: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204032.07831 [managed-node3]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 10215 1727204032.07888: _low_level_execute_command(): starting 10215 1727204032.07907: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 10215 1727204032.08096: Sending initial data 10215 1727204032.08099: Sent initial data (1181 bytes) 10215 1727204032.08562: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204032.08635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204032.08657: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204032.08676: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204032.08777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204032.12560: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 10215 1727204032.12998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204032.13002: stdout chunk (state=3): >>><<< 10215 1727204032.13005: stderr chunk (state=3): >>><<< 10215 1727204032.13011: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204032.13041: variable 'ansible_facts' from source: unknown 10215 1727204032.13051: variable 'ansible_facts' from source: unknown 10215 1727204032.13069: variable 'ansible_module_compression' from source: unknown 10215 1727204032.13130: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10215 1727204032.13165: variable 'ansible_facts' from source: unknown 10215 1727204032.13369: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py 10215 1727204032.13568: Sending initial data 10215 1727204032.13682: Sent initial data (153 bytes) 10215 1727204032.14254: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204032.14303: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204032.14388: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204032.14411: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 10215 1727204032.14442: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204032.14515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204032.16120: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204032.16177: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204032.16235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpz2ne5buf /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py <<< 10215 1727204032.16239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py" <<< 10215 1727204032.16272: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpz2ne5buf" to remote "/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py" <<< 10215 1727204032.18816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204032.18925: stderr chunk (state=3): >>><<< 10215 1727204032.18932: stdout chunk (state=3): >>><<< 10215 1727204032.18963: done transferring module to remote 10215 1727204032.19011: _low_level_execute_command(): starting 10215 1727204032.19023: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/ /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py && sleep 0' 10215 1727204032.19691: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204032.19710: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204032.19728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204032.19760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204032.19871: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204032.19907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204032.19974: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204032.21908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204032.21919: stdout chunk (state=3): >>><<< 10215 1727204032.21963: stderr chunk (state=3): >>><<< 10215 1727204032.21995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204032.21998: _low_level_execute_command(): starting 10215 1727204032.22166: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/AnsiballZ_setup.py && sleep 0' 10215 1727204032.22807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204032.22811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204032.22814: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10215 1727204032.22817: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204032.22819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204032.22875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204032.22901: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10215 1727204032.22976: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204032.25163: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10215 1727204032.25193: stdout chunk (state=3): >>>import _imp # builtin <<< 10215 1727204032.25233: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 10215 1727204032.25254: stdout chunk (state=3): >>>import '_weakref' # <<< 10215 1727204032.25314: stdout chunk (state=3): >>>import '_io' # <<< 10215 1727204032.25317: stdout chunk (state=3): >>>import 'marshal' # <<< 10215 1727204032.25353: stdout chunk (state=3): >>>import 'posix' # <<< 10215 1727204032.25384: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10215 1727204032.25423: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 10215 1727204032.25475: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.25506: stdout chunk (state=3): >>>import '_codecs' # <<< 10215 1727204032.25523: stdout chunk (state=3): >>>import 'codecs' # <<< 10215 1727204032.25572: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10215 1727204032.25584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 10215 1727204032.25612: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61171b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117183ad0> <<< 10215 1727204032.25646: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 10215 1727204032.25658: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61171b6a20> <<< 10215 1727204032.25670: stdout chunk (state=3): >>>import '_signal' # <<< 10215 1727204032.25704: stdout chunk (state=3): >>>import '_abc' # <<< 10215 1727204032.25721: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 10215 1727204032.25756: stdout chunk (state=3): >>>import '_stat' # <<< 10215 1727204032.25766: stdout chunk (state=3): >>>import 'stat' # <<< 10215 1727204032.25853: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10215 1727204032.25885: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10215 1727204032.25949: stdout chunk (state=3): >>>import 'os' # <<< 10215 1727204032.25956: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10215 1727204032.25984: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10215 1727204032.26026: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py <<< 10215 1727204032.26047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f650a0> <<< 10215 1727204032.26118: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10215 1727204032.26144: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f65fd0> <<< 10215 1727204032.26163: stdout chunk (state=3): >>>import 'site' # <<< 10215 1727204032.26193: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10215 1727204032.26593: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10215 1727204032.26619: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 10215 1727204032.26641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.26659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10215 1727204032.26706: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10215 1727204032.26719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10215 1727204032.26751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10215 1727204032.26815: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa3e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10215 1727204032.26818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10215 1727204032.26829: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa3ec0> <<< 10215 1727204032.26874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10215 1727204032.27210: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fdb800> # 
/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fdbe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fbbad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fb91f0> <<< 10215 1727204032.27237: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa0fb0> <<< 10215 1727204032.27267: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10215 1727204032.27285: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10215 1727204032.27314: stdout chunk (state=3): >>>import '_sre' # <<< 10215 1727204032.27326: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10215 1727204032.27351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10215 1727204032.27374: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10215 1727204032.27411: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fff710> <<< 10215 1727204032.27437: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ffe330> <<< 10215 1727204032.27463: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fba1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa2ea0> <<< 10215 1727204032.27526: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 10215 1727204032.27553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117030740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa0230> <<< 10215 1727204032.27575: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10215 1727204032.27601: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6117030bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117030aa0> <<< 10215 1727204032.27642: stdout chunk (state=3): >>># extension module 
'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.27655: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6117030e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f9ed50> <<< 10215 1727204032.27692: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.27710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10215 1727204032.27754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 10215 1727204032.27769: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117031550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117031220> import 'importlib.machinery' # <<< 10215 1727204032.27816: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 10215 1727204032.27838: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117032450> import 'importlib.util' # import 'runpy' # <<< 10215 1727204032.27870: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10215 1727204032.27912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10215 1727204032.27952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704c680> <<< 10215 1727204032.27970: stdout chunk (state=3): >>>import 'errno' # <<< 10215 1727204032.27983: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704ddc0> <<< 10215 1727204032.28024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10215 1727204032.28055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 10215 1727204032.28068: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704ecc0> <<< 10215 1727204032.28117: stdout chunk (state=3): >>># extension module '_bz2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704e210> <<< 10215 1727204032.28140: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 10215 1727204032.28151: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10215 1727204032.28194: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704fda0> <<< 10215 1727204032.28229: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704f4d0> <<< 10215 1727204032.28257: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61170324b0> <<< 10215 1727204032.28304: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10215 1727204032.28309: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10215 1727204032.28350: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10215 1727204032.28369: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10215 1727204032.28400: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d57d10> <<< 10215 1727204032.28432: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10215 1727204032.28461: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d80560> <<< 10215 1727204032.28511: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80830> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from 
'/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80a10> <<< 10215 1727204032.28546: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d55eb0> <<< 10215 1727204032.28550: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10215 1727204032.28671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 10215 1727204032.28675: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 10215 1727204032.28733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d820f0> <<< 10215 1727204032.28736: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d80d70> <<< 10215 1727204032.28775: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117032ba0> <<< 10215 1727204032.28805: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10215 1727204032.28838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.28881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10215 1727204032.28885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10215 1727204032.28933: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116daa4b0> <<< 10215 1727204032.28988: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10215 1727204032.29002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.29032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10215 1727204032.29086: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc65d0> <<< 10215 1727204032.29106: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10215 1727204032.29138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10215 1727204032.29218: stdout chunk (state=3): >>>import 'ntpath' # <<< 10215 1727204032.29245: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dfb3b0> <<< 10215 1727204032.29281: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10215 1727204032.29291: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10215 1727204032.29314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10215 1727204032.29349: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10215 1727204032.29448: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116e21b50> <<< 10215 1727204032.29520: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dfb4d0> <<< 10215 1727204032.29583: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc7260> <<< 10215 1727204032.29618: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c44470> <<< 10215 1727204032.29633: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc5610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d83050> <<< 10215 1727204032.29799: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10215 1727204032.29818: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6116c44740> <<< 10215 1727204032.29985: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload__tg_2hmc/ansible_ansible.legacy.setup_payload.zip' <<< 10215 1727204032.30001: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.30150: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.30177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10215 1727204032.30227: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10215 1727204032.30317: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10215 1727204032.30350: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116cae1e0> import '_typing' # <<< 10215 1727204032.30545: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c85100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c84260> # zipimport: zlib available <<< 10215 1727204032.30596: stdout chunk (state=3): >>>import 'ansible' # <<< 10215 1727204032.30622: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib 
available <<< 10215 1727204032.30640: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils' # <<< 10215 1727204032.30652: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.32207: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.33495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c875f0> <<< 10215 1727204032.33526: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.33560: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 10215 1727204032.33565: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10215 1727204032.33595: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10215 1727204032.33615: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce1c70> <<< 10215 1727204032.33653: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1a00> <<< 10215 1727204032.33697: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1310> <<< 10215 1727204032.33718: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10215 1727204032.33762: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1d60> <<< 10215 1727204032.33813: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116caee70> import 'atexit' # <<< 10215 1727204032.33830: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce2a20> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce2c30> <<< 10215 1727204032.33859: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10215 1727204032.33911: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 10215 1727204032.33972: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce3140> <<< 10215 1727204032.33975: stdout chunk (state=3): >>>import 'pwd' # <<< 10215 1727204032.34020: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10215 1727204032.34034: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10215 1727204032.34046: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b44ec0> <<< 10215 1727204032.34088: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b46ae0> <<< 10215 1727204032.34122: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10215 1727204032.34126: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10215 1727204032.34173: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b47410> <<< 10215 1727204032.34177: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10215 1727204032.34220: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10215 1727204032.34231: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b485f0> <<< 10215 1727204032.34254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10215 1727204032.34288: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10215 1727204032.34307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10215 1727204032.34366: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4b0e0> <<< 10215 1727204032.34409: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b4b200> <<< 10215 1727204032.34438: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b493a0> <<< 10215 1727204032.34459: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10215 1727204032.34505: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10215 1727204032.34524: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10215 1727204032.34561: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10215 1727204032.34593: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 10215 1727204032.34613: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4f020> import '_tokenize' # <<< 10215 1727204032.34677: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4daf0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4d850> <<< 10215 1727204032.34703: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10215 1727204032.34784: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4ff80> <<< 10215 1727204032.34810: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b498b0> <<< 10215 1727204032.34859: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b931a0> <<< 10215 1727204032.34903: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b933e0> <<< 10215 1727204032.34928: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10215 1727204032.34956: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10215 1727204032.34967: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10215 1727204032.35001: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b98e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b98c20> 
# /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10215 1727204032.35113: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10215 1727204032.35165: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.35183: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b9b3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b99520> <<< 10215 1727204032.35212: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10215 1727204032.35235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.35274: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10215 1727204032.35277: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 10215 1727204032.35293: stdout chunk (state=3): >>>import '_string' # <<< 10215 1727204032.35335: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba2ae0> <<< 10215 1727204032.35486: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b9b470> <<< 10215 1727204032.35567: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba38c0> <<< 10215 1727204032.35601: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba3920> <<< 10215 1727204032.35661: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba3e30> <<< 10215 1727204032.35693: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b935c0> <<< 10215 1727204032.35711: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10215 1727204032.35723: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc 
matches /usr/lib64/python3.12/socket.py <<< 10215 1727204032.35742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10215 1727204032.35776: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.35812: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba6ae0> <<< 10215 1727204032.36120: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba7fe0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba5250> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba6630> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba4e30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 10215 1727204032.36223: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.36326: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204032.36406: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10215 1727204032.36409: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.36537: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.36680: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.37351: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.38258: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a30200> <<< 10215 1727204032.38262: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10215 1727204032.38285: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f6116a31760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116bab3e0> <<< 10215 1727204032.38328: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10215 1727204032.38412: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 10215 1727204032.38682: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.38750: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 10215 1727204032.38805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a31850> # zipimport: zlib available <<< 10215 1727204032.39401: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.39895: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.39969: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40059: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10215 1727204032.40074: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40122: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40252: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10215 1727204032.40256: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40282: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40381: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10215 1727204032.40410: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 10215 1727204032.40460: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40504: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10215 1727204032.40518: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.40792: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10215 1727204032.41134: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10215 1727204032.41158: stdout chunk (state=3): >>>import '_ast' # <<< 10215 1727204032.41237: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a33560> <<< 10215 1727204032.41262: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41331: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41423: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10215 1727204032.41461: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 10215 1727204032.41481: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10215 1727204032.41542: stdout chunk (state=3): >>># 
extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.41677: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a39be0> <<< 10215 1727204032.41748: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a3a4b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116bab410> <<< 10215 1727204032.41764: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41814: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41853: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 10215 1727204032.41913: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.41953: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42019: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42091: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10215 1727204032.42132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.42218: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a39400> <<< 10215 1727204032.42264: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a3a660> <<< 10215 1727204032.42298: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10215 1727204032.42314: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42375: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42439: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42468: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42537: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204032.42541: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10215 1727204032.42573: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10215 1727204032.42598: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10215 1727204032.42641: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10215 1727204032.42672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10215 1727204032.42684: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10215 1727204032.42740: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad2900> <<< 10215 1727204032.42792: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a44530> <<< 10215 1727204032.42883: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a426c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a42510> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10215 1727204032.42930: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42934: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.42960: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10215 1727204032.43019: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10215 1727204032.43051: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 10215 1727204032.43120: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43211: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43215: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43239: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43276: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43326: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43361: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43418: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available <<< 10215 1727204032.43493: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43591: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43594: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.43642: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 10215 1727204032.43848: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.44174: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.44183: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 10215 1727204032.44219: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 10215 1727204032.44260: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad4ec0> <<< 10215 1727204032.44263: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 10215 1727204032.44296: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10215 1727204032.44329: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 10215 1727204032.44358: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10215 1727204032.44401: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 10215 1727204032.44439: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f47ef0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.44442: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f48230> <<< 10215 1727204032.44495: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a48f80> <<< 10215 1727204032.44537: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a48380> <<< 10215 1727204032.44540: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad6f60> <<< 10215 1727204032.44564: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad7560> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 10215 1727204032.44628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 10215 1727204032.44649: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10215 1727204032.44672: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10215 1727204032.44733: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.44759: stdout chunk (state=3): >>># extension module '_heapq' executed from 
'/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f4b2f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4aba0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f4ad80> <<< 10215 1727204032.44786: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f49fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10215 1727204032.44909: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10215 1727204032.44913: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4b350> <<< 10215 1727204032.44960: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 10215 1727204032.44979: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10215 1727204032.45019: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115fb5e80> <<< 10215 1727204032.45022: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4be60> <<< 10215 1727204032.45057: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad6ae0> import 'ansible.module_utils.facts.timeout' # <<< 10215 1727204032.45094: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 10215 1727204032.45124: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45127: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 10215 1727204032.45231: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45258: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available <<< 10215 1727204032.45320: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45388: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 10215 1727204032.45399: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45402: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45425: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 10215 1727204032.45456: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45494: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib 
available <<< 10215 1727204032.45545: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 10215 1727204032.45615: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45660: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 10215 1727204032.45727: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45770: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45829: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45895: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.45966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 10215 1727204032.45981: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.46523: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47026: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available <<< 10215 1727204032.47083: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47142: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47172: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47215: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10215 1727204032.47257: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47259: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 10215 1727204032.47303: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47360: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10215 1727204032.47438: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47475: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 10215 1727204032.47521: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47543: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47585: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 10215 1727204032.47691: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.47799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 10215 1727204032.47803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10215 1727204032.47832: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fb7f80> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10215 1727204032.47865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10215 
1727204032.48017: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fb6b10> import 'ansible.module_utils.facts.system.local' # <<< 10215 1727204032.48023: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48090: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10215 1727204032.48181: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48276: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48387: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 10215 1727204032.48406: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48461: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48550: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 10215 1727204032.48553: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48599: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.48642: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10215 1727204032.48694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10215 1727204032.48781: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.49061: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115fe2150> <<< 10215 1727204032.49083: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fd0500> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 10215 1727204032.49124: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49200: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 10215 1727204032.49203: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49297: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49377: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49517: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49674: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 10215 1727204032.49691: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49733: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.49865: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204032.49892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10215 1727204032.49918: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204032.50014: 
stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115ffdd00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115ffd910> <<< 10215 1727204032.50017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 10215 1727204032.50020: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 10215 1727204032.50022: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50051: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 10215 1727204032.50280: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50457: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 10215 1727204032.50477: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50574: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50684: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50728: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.50782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 10215 1727204032.50830: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204032.50845: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51004: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51177: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 10215 1727204032.51181: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51317: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51468: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 10215 1727204032.51472: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51504: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.51536: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.52172: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.52757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 10215 1727204032.52782: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.52889: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53017: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 10215 1727204032.53037: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53126: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 10215 1727204032.53433: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 
10215 1727204032.53621: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 10215 1727204032.53642: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53677: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53749: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 10215 1727204032.53755: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53849: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.53963: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54273: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54419: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 10215 1727204032.54444: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54495: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54537: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 10215 1727204032.54557: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54569: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54591: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 10215 1727204032.54607: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54673: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54757: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 10215 1727204032.54779: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54826: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54830: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 10215 1727204032.54847: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54895: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.54972: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 10215 1727204032.54975: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55028: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 10215 1727204032.55116: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55411: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55715: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 10215 1727204032.55721: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55772: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55854: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 10215 1727204032.55861: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55889: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55938: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 10215 1727204032.55943: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.55977: stdout chunk (state=3): >>># zipimport: zlib available <<< 
10215 1727204032.56029: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10215 1727204032.56038: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56053: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56103: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 10215 1727204032.56120: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56193: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56303: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 10215 1727204032.56325: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56343: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 10215 1727204032.56373: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56445: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 10215 1727204032.56448: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56474: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56483: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56532: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56587: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56662: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56759: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 10215 1727204032.56763: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 10215 1727204032.56787: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56824: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.56894: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 10215 1727204032.56897: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57120: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57344: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 10215 1727204032.57364: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57403: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57463: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 10215 1727204032.57512: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57571: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 10215 1727204032.57594: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57661: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57758: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 10215 1727204032.57775: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.57878: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.58058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 
'ansible.module_utils.facts' # <<< 10215 1727204032.58191: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204032.58441: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 10215 1727204032.58467: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 10215 1727204032.58490: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 10215 1727204032.58528: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115e26540> <<< 10215 1727204032.58549: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e27ce0> <<< 10215 1727204032.58597: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e23f20> <<< 10215 1727204032.72793: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 10215 1727204032.72853: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6c440> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 10215 1727204032.72879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 10215 1727204032.72896: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6d310> <<< 10215 1727204032.72946: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 10215 1727204032.72992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 10215 1727204032.73030: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6f710> <<< 10215 1727204032.73034: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6e330> <<< 10215 1727204032.73314: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 
10215 1727204032.96296: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_loadavg": {"1m": 0.4677734375, "5m": 0.390625, "15m": 0.224609375}, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 
22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100<<< 10215 1727204032.96303: stdout chunk (state=3): >>>.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251165040640, "block_size": 4096, "block_total": 64479564, "block_available": 61319590, "block_used": 3159974, "inode_total": 16384000, "inode_available": 16302335, "inode_used": 81665, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "52", "epoch": "1727204032", "epoch_int": "1727204032", "date": "2024-09-24", "time": "14:53:52", "iso8601_micro": "2024-09-24T18:53:52.931981Z", "iso8601": "2024-09-24T18:53:<<< 10215 1727204032.96334: stdout chunk (state=3): >>>52Z", "iso8601_basic": "20240924T145352931981", "iso8601_basic_short": "20240924T145352", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10215 1727204032.96897: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10215 1727204032.96910: stdout chunk (state=3): >>># 
clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 10215 1727204032.96936: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os <<< 10215 1727204032.96984: stdout chunk (state=3): >>># cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap <<< 10215 1727204032.97001: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 <<< 10215 1727204032.97037: stdout chunk (state=3): >>># cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing 
zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string <<< 10215 1727204032.97077: stdout chunk (state=3): >>># destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 10215 1727204032.97122: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # 
cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util <<< 10215 1727204032.97150: stdout chunk (state=3): >>># cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] 
removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys <<< 10215 1727204032.97168: stdout chunk (state=3): >>># cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly <<< 10215 1727204032.97221: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts <<< 10215 1727204032.97238: 
stdout chunk (state=3): >>># destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing 
multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 10215 1727204032.97570: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10215 1727204032.97588: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10215 1727204032.97645: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10215 1727204032.97673: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 10215 1727204032.97696: stdout chunk (state=3): >>># destroy ntpath <<< 10215 1727204032.97731: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 10215 1727204032.97773: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale <<< 10215 1727204032.97793: stdout chunk (state=3): >>># destroy locale # destroy select # destroy _signal # destroy _posixsubprocess <<< 10215 1727204032.97813: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 10215 1727204032.97847: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 10215 1727204032.97869: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 10215 1727204032.97932: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 10215 1727204032.97995: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 10215 1727204032.97999: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 10215 1727204032.98071: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 10215 1727204032.98074: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 10215 1727204032.98139: stdout chunk (state=3): >>># destroy json # destroy socket # destroy struct <<< 10215 1727204032.98150: stdout chunk (state=3): >>># destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 10215 1727204032.98213: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 10215 1727204032.98277: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping 
systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 10215 1727204032.98313: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap <<< 10215 1727204032.98349: stdout chunk (state=3): >>># cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator <<< 10215 1727204032.98409: stdout chunk (state=3): >>># cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 <<< 10215 1727204032.98433: stdout chunk (state=3): >>># cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10215 1727204032.98592: stdout chunk (state=3): >>># destroy sys.monitoring <<< 10215 1727204032.98640: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 10215 1727204032.98651: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 10215 1727204032.98676: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 10215 1727204032.98745: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize <<< 10215 1727204032.98748: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 10215 1727204032.98778: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy 
_io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10215 1727204032.98878: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 10215 1727204032.98919: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 10215 1727204032.98937: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 10215 1727204032.98997: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins <<< 10215 1727204032.99014: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 10215 1727204032.99548: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204032.99582: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 10215 1727204032.99585: stdout chunk (state=3): >>><<< 10215 1727204032.99587: stderr chunk (state=3): >>><<< 10215 1727204032.99906: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61171b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117183ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61171b6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f650a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f65fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa3e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa3ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fdb800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fdbe90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fbbad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fb91f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa0fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fff710> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f6116ffe330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fba1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa2ea0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117030740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116fa0230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6117030bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117030aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6117030e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116f9ed50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117031550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117031220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117032450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704c680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704ddc0> 
# /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704ecc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704f320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704e210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f611704fda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f611704f4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f61170324b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d57d10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80800> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d80560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80830> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116d80a10> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d55eb0> # 
/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d820f0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d80d70> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6117032ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116daa4b0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc65d0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dfb3b0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116e21b50> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dfb4d0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc7260> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c44470> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116dc5610> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116d83050> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f6116c44740> # zipimport: found 103 names in 
'/tmp/ansible_ansible.legacy.setup_payload__tg_2hmc/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116cae1e0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c85100> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c84260> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116c875f0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce1c70> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1a00> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1310> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce1d60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116caee70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce2a20> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ce2c30> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ce3140> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b44ec0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b46ae0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b47410> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b485f0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4b0e0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b4b200> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b493a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4f020> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4daf0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4d850> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches 
/usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b4ff80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b498b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b931a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b933e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b98e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b98c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116b9b3b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b99520> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba2ae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b9b470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba38c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba3920> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba3e30> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116b935c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba6ae0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba7fe0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba5250> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116ba6630> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ba4e30> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a30200> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a31760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116bab3e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a31850> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a33560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a39be0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a3a4b0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116bab410> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6116a39400> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a3a660> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad2900> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a44530> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a426c0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a42510> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad4ec0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f47ef0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f48230> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a48f80> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116a48380> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad6f60> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad7560> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f4b2f0> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4aba0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115f4ad80> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f49fd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4b350> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115fb5e80> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115f4be60> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6116ad6ae0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fb7f80> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fb6b10> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115fe2150> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115fd0500> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115ffdd00> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115ffd910> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f6115e26540> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e27ce0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e23f20> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f6115e6c440> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6d310> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6f710> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f6115e6e330> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_local": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", 
"ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_loadavg": {"1m": 0.4677734375, "5m": 0.390625, "15m": 0.224609375}, "ansible_lsb": {}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2854, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 863, "free": 2854}, "nocache": {"free": 3460, "used": 257}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", 
"ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 536, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251165040640, "block_size": 4096, "block_total": 64479564, "block_available": 61319590, "block_used": 3159974, "inode_total": 16384000, "inode_available": 16302335, "inode_used": 81665, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "52", "epoch": "1727204032", "epoch_int": "1727204032", "date": "2024-09-24", "time": "14:53:52", "iso8601_micro": "2024-09-24T18:53:52.931981Z", "iso8601": "2024-09-24T18:53:52Z", "iso8601_basic": "20240924T145352931981", "iso8601_basic_short": "20240924T145352", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_hostnqn": "", "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": 
"link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_service_mgr": "systemd", "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing 
runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy 
ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] 
removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy 
ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc 
# cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: (same interpreter shutdown cleanup output as shown above)
[WARNING]: Platform linux on host managed-node3 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
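The interpreter-discovery warning above can be avoided by pinning ansible_python_interpreter for the host instead of relying on discovery. A minimal sketch of an inventory entry that would do this follows; the host name and paths are taken from the warning only as an illustration, so adjust them to the real inventory.

all:
  hosts:
    managed-node3:
      ansible_host: 10.31.10.90                        # connection address seen in this run (illustrative)
      ansible_python_interpreter: /usr/bin/python3.12  # pin the interpreter so discovery (and this warning) is skipped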
10215 1727204033.02496: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204033.02574: _low_level_execute_command(): starting 10215 1727204033.02583: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204031.5349262-10346-34182444860616/ > /dev/null 2>&1 && sleep 0' 10215 1727204033.03261: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204033.03309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.03326: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204033.03407: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.03443: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.03467: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.03491: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.03573: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.05731: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.05735: stdout chunk (state=3): >>><<< 10215 1727204033.05737: stderr chunk (state=3): >>><<< 10215 1727204033.05841: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.05844: handler run complete 10215 1727204033.05970: variable 'ansible_facts' from source: unknown 10215 1727204033.06105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.06580: variable 'ansible_facts' from source: unknown 10215 1727204033.06691: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.06897: attempt loop complete, returning result 10215 1727204033.06908: _execute() done 10215 1727204033.06916: dumping result to json 10215 1727204033.06960: done dumping result, returning 10215 1727204033.06976: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-3c74-8f8e-0000000000cc] 10215 1727204033.06988: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000cc 10215 1727204033.07641: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000cc 10215 1727204033.07645: WORKER PROCESS EXITING ok: [managed-node3] 10215 1727204033.08336: no more pending results, returning what we have 10215 1727204033.08340: results queue empty 10215 1727204033.08341: checking for any_errors_fatal 10215 1727204033.08343: done checking for any_errors_fatal 10215 1727204033.08344: checking for max_fail_percentage 10215 1727204033.08346: done checking for max_fail_percentage 10215 1727204033.08347: checking to see if all hosts have failed and the running result is not ok 10215 1727204033.08348: done checking to see if all hosts have failed 10215 1727204033.08349: getting the remaining hosts for this loop 10215 1727204033.08351: done getting the remaining hosts for this loop 10215 1727204033.08355: getting the next task for host managed-node3 10215 1727204033.08362: done getting next task for host managed-node3 10215 1727204033.08364: ^ task is: TASK: meta (flush_handlers) 10215 1727204033.08366: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204033.08370: getting variables 10215 1727204033.08372: in VariableManager get_vars() 10215 1727204033.08399: Calling all_inventory to load vars for managed-node3 10215 1727204033.08402: Calling groups_inventory to load vars for managed-node3 10215 1727204033.08406: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.08421: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.08425: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.08429: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.08667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.08943: done with get_vars() 10215 1727204033.08961: done getting variables 10215 1727204033.09042: in VariableManager get_vars() 10215 1727204033.09053: Calling all_inventory to load vars for managed-node3 10215 1727204033.09056: Calling groups_inventory to load vars for managed-node3 10215 1727204033.09059: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.09070: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.09073: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.09077: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.09290: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.09564: done with get_vars() 10215 1727204033.09579: done queuing things up, now waiting for results queue to drain 10215 1727204033.09581: results queue empty 10215 1727204033.09582: checking for any_errors_fatal 10215 1727204033.09585: done checking for any_errors_fatal 10215 1727204033.09586: checking for max_fail_percentage 10215 1727204033.09587: done checking for max_fail_percentage 10215 1727204033.09588: checking to see if all hosts have failed and the running result is not ok 10215 1727204033.09594: done checking to see if all hosts have failed 10215 1727204033.09595: getting the remaining hosts for this loop 10215 1727204033.09597: done getting the remaining hosts for this loop 10215 1727204033.09600: getting the next task for host managed-node3 10215 1727204033.09604: done getting next task for host managed-node3 10215 1727204033.09611: ^ task is: TASK: Include the task 'el_repo_setup.yml' 10215 1727204033.09613: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204033.09616: getting variables 10215 1727204033.09617: in VariableManager get_vars() 10215 1727204033.09626: Calling all_inventory to load vars for managed-node3 10215 1727204033.09628: Calling groups_inventory to load vars for managed-node3 10215 1727204033.09631: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.09637: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.09640: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.09643: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.09848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.10132: done with get_vars() 10215 1727204033.10141: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11 Tuesday 24 September 2024 14:53:53 -0400 (0:00:01.644) 0:00:01.669 ***** 10215 1727204033.10228: entering _queue_task() for managed-node3/include_tasks 10215 1727204033.10230: Creating lock for include_tasks 10215 1727204033.10560: worker is 1 (out of 1 available) 10215 1727204033.10575: exiting _queue_task() for managed-node3/include_tasks 10215 1727204033.10592: done queuing things up, now waiting for results queue to drain 10215 1727204033.10599: waiting for pending results... 10215 1727204033.10837: running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' 10215 1727204033.10953: in run() - task 12b410aa-8751-3c74-8f8e-000000000006 10215 1727204033.10977: variable 'ansible_search_path' from source: unknown 10215 1727204033.11030: calling self._execute() 10215 1727204033.11115: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.11135: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.11153: variable 'omit' from source: magic vars 10215 1727204033.11283: _execute() done 10215 1727204033.11295: dumping result to json 10215 1727204033.11343: done dumping result, returning 10215 1727204033.11347: done running TaskExecutor() for managed-node3/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-3c74-8f8e-000000000006] 10215 1727204033.11349: sending task result for task 12b410aa-8751-3c74-8f8e-000000000006 10215 1727204033.11494: no more pending results, returning what we have 10215 1727204033.11500: in VariableManager get_vars() 10215 1727204033.11533: Calling all_inventory to load vars for managed-node3 10215 1727204033.11536: Calling groups_inventory to load vars for managed-node3 10215 1727204033.11541: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.11556: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.11560: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.11564: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.11811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.12079: done with get_vars() 10215 1727204033.12087: variable 'ansible_search_path' from source: unknown 10215 1727204033.12101: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000006 10215 1727204033.12105: WORKER PROCESS EXITING 10215 1727204033.12112: we have included files to process 10215 1727204033.12114: generating 
all_blocks data 10215 1727204033.12115: done generating all_blocks data 10215 1727204033.12116: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10215 1727204033.12123: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10215 1727204033.12126: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 10215 1727204033.12935: in VariableManager get_vars() 10215 1727204033.12954: done with get_vars() 10215 1727204033.12968: done processing included file 10215 1727204033.12970: iterating over new_blocks loaded from include file 10215 1727204033.12972: in VariableManager get_vars() 10215 1727204033.12984: done with get_vars() 10215 1727204033.12985: filtering new block on tags 10215 1727204033.13008: done filtering new block on tags 10215 1727204033.13012: in VariableManager get_vars() 10215 1727204033.13044: done with get_vars() 10215 1727204033.13047: filtering new block on tags 10215 1727204033.13067: done filtering new block on tags 10215 1727204033.13070: in VariableManager get_vars() 10215 1727204033.13082: done with get_vars() 10215 1727204033.13084: filtering new block on tags 10215 1727204033.13108: done filtering new block on tags 10215 1727204033.13111: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node3 10215 1727204033.13118: extending task lists for all hosts with included blocks 10215 1727204033.13174: done extending task lists 10215 1727204033.13176: done processing included files 10215 1727204033.13177: results queue empty 10215 1727204033.13177: checking for any_errors_fatal 10215 1727204033.13179: done checking for any_errors_fatal 10215 1727204033.13180: checking for max_fail_percentage 10215 1727204033.13181: done checking for max_fail_percentage 10215 1727204033.13182: checking to see if all hosts have failed and the running result is not ok 10215 1727204033.13183: done checking to see if all hosts have failed 10215 1727204033.13184: getting the remaining hosts for this loop 10215 1727204033.13185: done getting the remaining hosts for this loop 10215 1727204033.13187: getting the next task for host managed-node3 10215 1727204033.13193: done getting next task for host managed-node3 10215 1727204033.13195: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 10215 1727204033.13198: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204033.13200: getting variables 10215 1727204033.13201: in VariableManager get_vars() 10215 1727204033.13215: Calling all_inventory to load vars for managed-node3 10215 1727204033.13218: Calling groups_inventory to load vars for managed-node3 10215 1727204033.13221: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.13226: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.13229: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.13232: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.13432: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.13707: done with get_vars() 10215 1727204033.13717: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:53 -0400 (0:00:00.035) 0:00:01.705 ***** 10215 1727204033.13802: entering _queue_task() for managed-node3/setup 10215 1727204033.14107: worker is 1 (out of 1 available) 10215 1727204033.14120: exiting _queue_task() for managed-node3/setup 10215 1727204033.14133: done queuing things up, now waiting for results queue to drain 10215 1727204033.14135: waiting for pending results... 10215 1727204033.14341: running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test 10215 1727204033.14473: in run() - task 12b410aa-8751-3c74-8f8e-0000000000dd 10215 1727204033.14497: variable 'ansible_search_path' from source: unknown 10215 1727204033.14506: variable 'ansible_search_path' from source: unknown 10215 1727204033.14558: calling self._execute() 10215 1727204033.14647: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.14662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.14679: variable 'omit' from source: magic vars 10215 1727204033.15400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204033.17903: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204033.18003: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204033.18056: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204033.18106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204033.18167: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204033.18263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204033.18314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204033.18694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10215 1727204033.18697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204033.18700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204033.18810: variable 'ansible_facts' from source: unknown 10215 1727204033.18898: variable 'network_test_required_facts' from source: task vars 10215 1727204033.18945: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 10215 1727204033.18957: variable 'omit' from source: magic vars 10215 1727204033.19005: variable 'omit' from source: magic vars 10215 1727204033.19052: variable 'omit' from source: magic vars 10215 1727204033.19083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204033.19123: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204033.19148: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204033.19174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204033.19192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204033.19235: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204033.19244: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.19253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.19378: Set connection var ansible_connection to ssh 10215 1727204033.19395: Set connection var ansible_pipelining to False 10215 1727204033.19412: Set connection var ansible_shell_type to sh 10215 1727204033.19426: Set connection var ansible_timeout to 10 10215 1727204033.19494: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204033.19498: Set connection var ansible_shell_executable to /bin/sh 10215 1727204033.19500: variable 'ansible_shell_executable' from source: unknown 10215 1727204033.19502: variable 'ansible_connection' from source: unknown 10215 1727204033.19504: variable 'ansible_module_compression' from source: unknown 10215 1727204033.19506: variable 'ansible_shell_type' from source: unknown 10215 1727204033.19510: variable 'ansible_shell_executable' from source: unknown 10215 1727204033.19512: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.19513: variable 'ansible_pipelining' from source: unknown 10215 1727204033.19515: variable 'ansible_timeout' from source: unknown 10215 1727204033.19525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.19695: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204033.19717: variable 'omit' from source: magic vars 10215 1727204033.19728: starting attempt loop 10215 
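The conditional evaluated above, (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts), decides whether the setup module needs to run at all: it is true whenever the facts gathered so far do not yet cover every fact the network test requires. A rough Python rendering follows; needs_fact_gathering is a hypothetical name and the fact names in the example are placeholders, not the actual contents of network_test_required_facts.

    def needs_fact_gathering(ansible_facts, required_facts):
        # True when at least one required fact key is still missing.
        gathered = set(ansible_facts) & set(required_facts)
        return gathered != set(required_facts)

    # Example: nothing gathered yet, so the task runs (matching the True above).
    assert needs_fact_gathering({}, ["distribution", "os_family"]) is True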
1727204033.19736: running the handler 10215 1727204033.19756: _low_level_execute_command(): starting 10215 1727204033.19895: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204033.20521: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204033.20537: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204033.20554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.20576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204033.20598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204033.20615: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204033.20631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.20706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.20751: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.20778: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.20801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.20883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.22632: stdout chunk (state=3): >>>/root <<< 10215 1727204033.22806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.22927: stderr chunk (state=3): >>><<< 10215 1727204033.22931: stdout chunk (state=3): >>><<< 10215 1727204033.22969: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.22991: _low_level_execute_command(): starting 10215 1727204033.23094: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497 `" && echo ansible-tmp-1727204033.2297683-10477-273726900194497="` echo /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497 `" ) && sleep 0' 10215 1727204033.23669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204033.23686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204033.23705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.23727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204033.23746: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204033.23759: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204033.23775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.23797: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204033.23811: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204033.23910: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.23929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.23999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.26003: stdout chunk (state=3): >>>ansible-tmp-1727204033.2297683-10477-273726900194497=/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497 <<< 10215 1727204033.26211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.26233: stdout chunk (state=3): >>><<< 10215 1727204033.26248: stderr chunk (state=3): >>><<< 10215 1727204033.26273: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204033.2297683-10477-273726900194497=/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.26347: variable 'ansible_module_compression' from source: unknown 10215 1727204033.26416: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10215 1727204033.26493: variable 'ansible_facts' from source: unknown 10215 1727204033.26709: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py 10215 1727204033.26930: Sending initial data 10215 1727204033.26933: Sent initial data (154 bytes) 10215 1727204033.27606: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.27687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.27734: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.27770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.29396: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204033.29443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
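The _low_level_execute_command() calls above show the usual staging sequence for a module: probe the remote home directory with 'echo ~', create a mode-0700 temporary directory under ~/.ansible/tmp, push the AnsiballZ wrapper over the multiplexed SSH connection, and (just below) chmod it before execution. The sketch reproduces that sequence with the plain ssh and scp CLIs, whereas the log itself uses an sftp put; stage_module is a hypothetical helper and the temp-directory naming is only an approximation of what appears in the log.

    import subprocess
    import time

    def stage_module(host, local_payload):
        def ssh(cmd):
            return subprocess.run(["ssh", host, cmd], check=True,
                                  capture_output=True, text=True).stdout.strip()

        home = ssh("echo ~ && sleep 0")                         # same probe as in the log
        tmp = f"{home}/.ansible/tmp/ansible-tmp-{time.time()}"  # naming scheme approximated
        ssh(f"umask 77 && mkdir -p {tmp}")
        subprocess.run(["scp", local_payload, f"{host}:{tmp}/AnsiballZ_setup.py"],
                       check=True)
        ssh(f"chmod u+x {tmp} {tmp}/AnsiballZ_setup.py")        # same chmod as in the log
        return tmp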
<<< 10215 1727204033.29495: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpwxzf7svh /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py <<< 10215 1727204033.29502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py" <<< 10215 1727204033.29533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpwxzf7svh" to remote "/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py" <<< 10215 1727204033.31925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.31971: stderr chunk (state=3): >>><<< 10215 1727204033.31983: stdout chunk (state=3): >>><<< 10215 1727204033.32024: done transferring module to remote 10215 1727204033.32060: _low_level_execute_command(): starting 10215 1727204033.32071: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/ /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py && sleep 0' 10215 1727204033.32807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.32878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.32898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.32922: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.33004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.34895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.34922: stderr chunk (state=3): >>><<< 10215 1727204033.34926: stdout chunk (state=3): >>><<< 10215 1727204033.34944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.34947: _low_level_execute_command(): starting 10215 1727204033.34955: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/AnsiballZ_setup.py && sleep 0' 10215 1727204033.35591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204033.35603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204033.35746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.35750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204033.35753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204033.35756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.35776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.35860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.38016: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10215 1727204033.38046: stdout chunk (state=3): >>>import _imp # builtin <<< 10215 1727204033.38082: stdout chunk (state=3): >>>import '_thread' # <<< 10215 1727204033.38095: stdout chunk (state=3): >>>import '_warnings' # import '_weakref' # <<< 10215 1727204033.38160: stdout chunk (state=3): >>>import '_io' # <<< 10215 1727204033.38170: stdout chunk (state=3): >>>import 'marshal' # <<< 10215 1727204033.38195: stdout chunk (state=3): >>>import 'posix' # <<< 10215 1727204033.38263: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 10215 1727204033.38268: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<< 10215 1727204033.38292: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<< 10215 1727204033.38334: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 10215 1727204033.38351: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<< 10215 1727204033.38380: stdout chunk (state=3): >>>import 'codecs' # <<< 10215 1727204033.38431: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10215 1727204033.38453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c92c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8fbad0> <<< 10215 1727204033.38496: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 10215 1727204033.38548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c92ea20> import '_signal' # <<< 10215 1727204033.38555: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 10215 1727204033.38572: stdout chunk (state=3): >>>import 'io' # <<< 10215 1727204033.38607: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10215 1727204033.38702: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10215 1727204033.38748: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 10215 1727204033.38808: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 10215 1727204033.38811: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages <<< 10215 1727204033.38875: stdout chunk (state=3): >>>Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10215 1727204033.38879: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 10215 1727204033.38902: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c71d0a0> <<< 10215 1727204033.38960: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10215 1727204033.39000: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c71dfd0> <<< 10215 1727204033.39045: stdout chunk (state=3): >>>import 'site' # <<< 10215 1727204033.39157: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
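What is executing here is a single self-contained wrapper: AnsiballZ_setup.py carries the setup module and its module_utils inside an embedded zip payload, which is why zipimport messages such as "found 103 names in .../ansible_setup_payload.zip" appear further down. The same packaging idea can be sketched with the standard-library zipapp module; build_and_run_payload and payload.pyz are hypothetical names, and this is not Ansible's actual AnsiballZ code.

    import os
    import subprocess
    import zipapp

    def build_and_run_payload(src_dir, target="payload.pyz"):
        # src_dir is assumed to contain __main__.py plus any helper modules.
        zipapp.create_archive(src_dir, target, interpreter="/usr/bin/env python3")
        os.chmod(target, 0o700)
        # PYTHONVERBOSE=1 reproduces the import trace interleaved in the output above.
        env = dict(os.environ, PYTHONVERBOSE="1")
        return subprocess.run(["python3", target], env=env).returncode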
<<< 10215 1727204033.39739: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10215 1727204033.39742: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10215 1727204033.39829: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204033.39834: stdout chunk (state=3): >>>import 'itertools' # <<< 10215 1727204033.39895: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7938c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c793f20> import '_collections' # <<< 10215 1727204033.39950: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c773b90> <<< 10215 1727204033.39976: stdout chunk (state=3): >>>import '_functools' # <<< 10215 1727204033.39992: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7712b0> <<< 10215 1727204033.40096: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c759070> <<< 10215 1727204033.40140: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10215 1727204033.40379: stdout chunk (state=3): >>>import '_sre' # <<< 10215 1727204033.40398: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py 
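The interleaved "# ... matches ...", "# code object from ..." and "import '...'" lines are CPython's own verbose-import trace, enabled by the PYTHONVERBOSE=1 prefix on the remote command (equivalent to running python -v). A minimal way to reproduce such a trace locally is sketched below; import_trace is a hypothetical helper, not part of Ansible.

    import os
    import subprocess
    import sys

    def import_trace(script):
        env = dict(os.environ, PYTHONVERBOSE="1")
        proc = subprocess.run([sys.executable, script], env=env,
                              capture_output=True, text=True)
        # Collect both the '# ...' bookkeeping lines and the 'import ...' lines,
        # whichever stream they arrive on.
        lines = (proc.stdout + proc.stderr).splitlines()
        return [line for line in lines if line.startswith(("import ", "# "))]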
# code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7b7740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7b6360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7722a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75af60> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e8770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py <<< 10215 1727204033.40412: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10215 1727204033.40695: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c7e8c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e8ad0> <<< 10215 1727204033.40774: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c7e8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c756e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e9220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c804650> import 'errno' # <<< 10215 1727204033.40807: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.40821: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c805d90> <<< 10215 1727204033.40845: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 10215 1727204033.40849: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10215 1727204033.40881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c806c90> <<< 10215 1727204033.41138: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c8072f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8061e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c807d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8074a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10215 1727204033.41143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10215 1727204033.41165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10215 1727204033.41194: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c50fcb0> <<< 10215 1727204033.41347: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c5387a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c538500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c538620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c538980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c50de50> <<< 10215 1727204033.41359: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10215 1727204033.41539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c53a060> <<< 10215 1727204033.41543: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c538ce0> <<< 10215 1727204033.41577: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10215 1727204033.41649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10215 1727204033.41698: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10215 1727204033.41726: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c566420> <<< 10215 1727204033.41824: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10215 1727204033.41830: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10215 1727204033.41887: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57e540> <<< 10215 1727204033.41914: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10215 1727204033.42047: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # 
/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py <<< 10215 1727204033.42060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5b72c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10215 1727204033.42094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10215 1727204033.42301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10215 1727204033.42320: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5dda60> <<< 10215 1727204033.42381: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5b73e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57f1d0> <<< 10215 1727204033.42431: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c3f43e0> <<< 10215 1727204033.42434: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c53afc0> <<< 10215 1727204033.42807: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4f8c57d940> <<< 10215 1727204033.42813: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_sh59xap_/ansible_setup_payload.zip' # zipimport: zlib available <<< 10215 1727204033.42951: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.43216: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c4620c0> import '_typing' # <<< 10215 1727204033.43351: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c438fb0> <<< 10215 1727204033.43402: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c438110> # zipimport: zlib available import 'ansible' # # zipimport: zlib available <<< 10215 1727204033.43435: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.43658: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 10215 1727204033.45021: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.46299: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c43bf50> <<< 10215 1727204033.46330: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204033.46363: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10215 1727204033.46410: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10215 1727204033.46463: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c491b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c491910> <<< 10215 1727204033.46527: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c491220> <<< 10215 1727204033.46531: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10215 1727204033.46578: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c4919a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c462d50> <<< 10215 1727204033.46613: stdout chunk (state=3): >>>import 'atexit' # <<< 10215 1727204033.46636: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c492900> <<< 10215 1727204033.46664: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c492b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10215 1727204033.46733: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10215 1727204033.46737: stdout chunk (state=3): >>>import '_locale' # <<< 10215 1727204033.46776: stdout chunk (state=3): >>>import 'locale' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c493020> <<< 10215 1727204033.46804: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10215 1727204033.46927: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10215 1727204033.46945: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2f8da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c2fa9c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 10215 1727204033.46975: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fb2f0> <<< 10215 1727204033.47009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10215 1727204033.47051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10215 1727204033.47129: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fc4d0> <<< 10215 1727204033.47132: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10215 1727204033.47135: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10215 1727204033.47185: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fef30> <<< 10215 1727204033.47282: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c2ff290> <<< 10215 1727204033.47327: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fd220> <<< 10215 1727204033.47331: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10215 1727204033.47334: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10215 1727204033.47412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10215 1727204033.47520: 
stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c302f30> <<< 10215 1727204033.47523: stdout chunk (state=3): >>>import '_tokenize' # <<< 10215 1727204033.47542: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c301a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c301760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10215 1727204033.47613: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c303e30> <<< 10215 1727204033.47711: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fd700> <<< 10215 1727204033.47716: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c347050> <<< 10215 1727204033.47772: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c347200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10215 1727204033.47775: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10215 1727204033.47853: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c34cda0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34cb60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10215 1727204033.48119: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10215 1727204033.48123: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c34f2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34d490> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204033.48126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10215 1727204033.48147: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 10215 1727204033.48194: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c356a80> <<< 10215 1727204033.48342: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34f410> <<< 10215 1727204033.48427: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c357890> <<< 10215 1727204033.48457: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c3576e0> <<< 10215 1727204033.48552: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c357bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c347500> <<< 10215 1727204033.48595: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10215 1727204033.48599: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 10215 1727204033.48696: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10215 1727204033.48700: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.48721: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35b350> <<< 10215 1727204033.48868: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.48906: stdout chunk (state=3): >>>import 'array' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35c740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c359b20> <<< 10215 1727204033.48959: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35aed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c359730> # zipimport: zlib available <<< 10215 1727204033.48996: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 10215 1727204033.49095: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.49184: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.49227: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10215 1727204033.49280: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.49397: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.49558: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.50219: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.51095: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 10215 1727204033.51232: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1e4860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e5760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c35f0e0> import 'ansible.module_utils.compat.selinux' # <<< 10215 1727204033.51256: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.51281: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # <<< 10215 1727204033.51300: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.51466: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.51656: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10215 1727204033.51676: stdout chunk (state=3): >>>import 'copy' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e53a0> <<< 10215 1727204033.51698: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.52248: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.52805: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.52896: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.52978: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10215 1727204033.53004: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53034: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53066: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10215 1727204033.53094: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53166: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53283: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10215 1727204033.53306: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.53335: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 10215 1727204033.53379: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53429: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10215 1727204033.53432: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.53788: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54199: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e65a0> # zipimport: zlib available <<< 10215 1727204033.54257: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54343: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 10215 1727204033.54357: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 10215 1727204033.54417: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10215 1727204033.54542: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.54601: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1ee2a0> <<< 10215 1727204033.54664: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1eec00> <<< 10215 1727204033.54704: stdout chunk (state=3): >>>import 'hashlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e7200> # zipimport: zlib available <<< 10215 1727204033.54746: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54783: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10215 1727204033.54786: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54962: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54965: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.54980: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.55017: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10215 1727204033.55058: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204033.55149: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.55212: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1eda30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1eeea0> <<< 10215 1727204033.55242: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10215 1727204033.55245: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.55395: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.55413: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.55618: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10215 1727204033.55684: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c282fc0> <<< 10215 1727204033.55765: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1fbe30> <<< 10215 1727204033.55875: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1f6f00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1f6c90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available <<< 10215 
1727204033.55896: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10215 1727204033.55949: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10215 1727204033.56099: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 10215 1727204033.56103: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.56128: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.56146: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.56227: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.56357: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.56360: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 10215 1727204033.56363: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.56473: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.56547: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.56578: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 10215 1727204033.56594: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.56784: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.57334: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c285dc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 10215 1727204033.57338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 10215 1727204033.57340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 10215 1727204033.57353: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b744500> <<< 10215 1727204033.57381: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.57398: stdout chunk (state=3): >>># extension module '_pickle' 
executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b744860> <<< 10215 1727204033.57613: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2655b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2648c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2844d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c284ef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 10215 1727204033.57660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 10215 1727204033.57666: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 10215 1727204033.57670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 10215 1727204033.57753: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b747830> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7470e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7472c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b746540> <<< 10215 1727204033.57766: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 10215 1727204033.57877: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 10215 1727204033.57982: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7478c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 10215 1727204033.57985: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7ae330> <<< 10215 1727204033.58011: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4f8c358bf0> <<< 10215 1727204033.58039: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2840b0> <<< 10215 1727204033.58079: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 10215 1727204033.58190: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.58232: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 10215 1727204033.58256: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.58413: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 10215 1727204033.58441: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.58469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 10215 1727204033.58481: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.58532: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.58616: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 10215 1727204033.58871: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.58886: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.58966: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 10215 1727204033.59511: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.59993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 10215 1727204033.60010: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.60158: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.60190: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 10215 1727204033.60211: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.60238: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.60382: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 10215 1727204033.60386: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.60407: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 10215 1727204033.60423: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.60448: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.60487: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available <<< 10215 1727204033.60595: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 10215 1727204033.60652: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 10215 1727204033.60747: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 10215 1727204033.60776: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7afc50> <<< 10215 1727204033.60810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 10215 1727204033.60837: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 10215 1727204033.61007: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7aef60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 10215 1727204033.61053: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61160: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 10215 1727204033.61164: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61243: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61344: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 10215 1727204033.61348: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61464: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61493: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 10215 1727204033.61592: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.61600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 10215 1727204033.61671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 10215 1727204033.61718: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.61793: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7e24b0> <<< 10215 1727204033.61985: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7ca1b0> import 'ansible.module_utils.facts.system.python' # <<< 10215 1727204033.62018: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62059: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62124: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 10215 1727204033.62253: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62281: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62313: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62484: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62615: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 10215 1727204033.62712: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 10215 1727204033.62825: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 10215 1727204033.62867: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204033.62872: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7fde50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7fda00> <<< 10215 1727204033.62874: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 10215 1727204033.62892: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62913: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 10215 1727204033.62959: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.62962: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.63049: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 10215 1727204033.63359: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.63376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 10215 1727204033.63475: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.63591: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.63626: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.63708: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 10215 1727204033.63712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # <<< 10215 1727204033.63806: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.63900: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.64059: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 10215 1727204033.64251: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.64611: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.65050: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.65635: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 10215 1727204033.65649: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.65781: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66007: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 10215 1727204033.66010: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66113: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.openbsd' # <<< 10215 1727204033.66126: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66290: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66474: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available <<< 10215 1727204033.66495: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 10215 1727204033.66517: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66729: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 10215 1727204033.66732: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.66835: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67060: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67290: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 10215 1727204033.67305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 10215 1727204033.67379: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67400: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available <<< 10215 1727204033.67492: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 10215 1727204033.67536: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 10215 1727204033.67632: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67648: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 10215 1727204033.67752: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.67919: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 10215 1727204033.67924: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.68028: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 10215 1727204033.68034: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68369: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 10215 1727204033.68588: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68626: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68709: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 10215 1727204033.68712: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68816: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 10215 1727204033.68830: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.68860: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 10215 1727204033.68875: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 10215 1727204033.68925: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 10215 1727204033.69320: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available <<< 10215 1727204033.69324: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69337: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69388: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69440: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69517: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69597: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 10215 1727204033.69657: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 10215 1727204033.69676: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69729: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 10215 1727204033.69765: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.69959: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70193: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 10215 1727204033.70197: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70233: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 10215 1727204033.70350: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 10215 1727204033.70503: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70595: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 10215 1727204033.70637: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70735: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.70856: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 10215 1727204033.70902: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204033.71541: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 10215 1727204033.71547: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 10215 1727204033.71564: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 10215 1727204033.71624: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b627770> <<< 10215 1727204033.71717: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b6242f0> <<< 10215 1727204033.71733: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b624ef0> <<< 10215 1727204033.72396: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "53", "epoch": "1727204033", "epoch_int": "1727204033", "date": "2024-09-24", "time": "14:53:53", "iso8601_micro": "2024-09-24T18:53:53.710330Z", "iso8601": "2024-09-24T18:53:53Z", "iso8601_basic": "20240924T145353710330", "iso8601_basic_short": "20240924T145353", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuG<<< 10215 1727204033.72417: stdout chunk (state=3): >>>RyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", 
"ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10215 1727204033.73157: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # 
cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 10215 1727204033.73167: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing 
grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid <<< 10215 1727204033.73171: stdout chunk (state=3): >>># cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common <<< 10215 1727204033.73279: stdout chunk (state=3): >>># destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing 
ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl <<< 10215 1727204033.73284: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # 
cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd <<< 10215 1727204033.73296: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai <<< 10215 1727204033.73363: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # 
destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 10215 1727204033.73675: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10215 1727204033.73905: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10215 1727204033.73911: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 10215 1727204033.73916: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 10215 1727204033.74018: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy 
grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle <<< 10215 1727204033.74048: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 10215 1727204033.74070: stdout chunk (state=3): >>># destroy multiprocessing.process # destroy unicodedata<<< 10215 1727204033.74179: stdout chunk (state=3): >>> # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 10215 1727204033.74182: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux <<< 10215 1727204033.74185: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 10215 1727204033.74198: stdout chunk (state=3): >>># destroy errno # destroy json <<< 10215 1727204033.74220: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob <<< 10215 1727204033.74264: stdout chunk (state=3): >>># destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 10215 1727204033.74288: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 10215 1727204033.74435: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 10215 1727204033.74446: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external <<< 10215 1727204033.74449: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 10215 1727204033.74452: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 10215 1727204033.74456: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools <<< 10215 
1727204033.74599: stdout chunk (state=3): >>># cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 10215 1727204033.74616: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 10215 1727204033.74720: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 10215 1727204033.74736: stdout chunk (state=3): >>># destroy _collections <<< 10215 1727204033.74768: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 10215 1727204033.74783: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize <<< 10215 1727204033.74916: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 10215 1727204033.74998: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 10215 1727204033.75031: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading <<< 10215 1727204033.75142: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 10215 1727204033.75581: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
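The import/teardown noise in the chunks above is CPython's own verbose trace (the gathered environment later in this capture shows "PYTHONVERBOSE": "1" on the managed node); the module's actual reply is the single JSON object embedded in the returned stdout. A minimal sketch for pulling that payload out of a saved capture, assuming the log was written to a file such as run.log (the file name and the helper name are hypothetical, not part of Ansible):

import json

def extract_ansible_facts(log_text: str) -> dict:
    # The setup module's reply starts with '{"ansible_facts"'; everything
    # around it is interpreter import/teardown noise.  json.JSONDecoder.raw_decode
    # parses exactly one JSON value starting at that offset and ignores the rest.
    start = log_text.find('{"ansible_facts"')
    if start == -1:
        raise ValueError("no ansible_facts payload found in log text")
    payload, _end = json.JSONDecoder().raw_decode(log_text, start)
    return payload["ansible_facts"]

# Usage sketch:
# with open("run.log") as fh:
#     facts = extract_ansible_facts(fh.read())
# print(facts["ansible_distribution"], facts["ansible_distribution_version"])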
<<< 10215 1727204033.75672: stderr chunk (state=3): >>><<< 10215 1727204033.75685: stdout chunk (state=3): >>><<< 10215 1727204033.76024: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c92c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8fbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c92ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c71d0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c71dfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75bec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75bf80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7938c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c793f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c773b90> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7712b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c759070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7b7740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7b6360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7722a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c75af60> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e8770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7582f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c7e8c20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e8ad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c7e8e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c756e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e9520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7e9220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c804650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c805d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f4f8c806c90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c8072f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8061e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c807d70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c8074a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c50fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c5387a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c538500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c538620> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c538980> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c50de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c53a060> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c538ce0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c7ea600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c566420> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57e540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5b72c0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5dda60> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c5b73e0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57f1d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c3f43e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c57d580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c53afc0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f4f8c57d940> # zipimport: found 103 names in '/tmp/ansible_setup_payload_sh59xap_/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c4620c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c438fb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c438110> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c43bf50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c491b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c491910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c491220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c4919a0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c462d50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c492900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c492b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c493020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2f8da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c2fa9c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fb2f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fc4d0> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fef30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c2ff290> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fd220> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c302f30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c301a00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c301760> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c303e30> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2fd700> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c347050> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c347200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c34cda0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34cb60> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c34f2f0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34d490> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c356a80> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c34f410> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c357890> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c3576e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c357bf0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c347500> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35b350> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35c740> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c359b20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c35aed0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c359730> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1e4860> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e5760> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c35f0e0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e53a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e65a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1ee2a0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1eec00> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1e7200> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8c1eda30> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1eeea0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c282fc0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1fbe30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1f6f00> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c1f6c90> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c285dc0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b744500> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b744860> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2655b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2648c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2844d0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c284ef0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b747830> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7470e0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7472c0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b746540> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7478c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7ae330> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c358bf0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8c2840b0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7afc50> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7aef60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7e24b0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7ca1b0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b7fde50> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b7fda00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f4f8b627770> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b6242f0> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f4f8b624ef0> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "53", "epoch": "1727204033", "epoch_int": "1727204033", "date": "2024-09-24", "time": "14:53:53", "iso8601_micro": "2024-09-24T18:53:53.710330Z", "iso8601": "2024-09-24T18:53:53Z", "iso8601_basic": "20240924T145353710330", "iso8601_basic_short": "20240924T145353", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", 
"ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_fips": false, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing 
importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing 
ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing 
ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy 
ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping 
threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] 
removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] 
removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] 
removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy 
ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping 
collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 10215 1727204033.78005: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204033.78011: _low_level_execute_command(): starting 10215 1727204033.78014: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204033.2297683-10477-273726900194497/ > /dev/null 2>&1 && sleep 0' 10215 1727204033.78366: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.78369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.78372: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204033.78375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204033.78377: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.78657: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.78661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204033.78664: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.78701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.80798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.80802: stdout chunk (state=3): >>><<< 10215 1727204033.80805: stderr chunk (state=3): >>><<< 10215 1727204033.80807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.80810: handler run complete 10215 1727204033.80848: variable 'ansible_facts' from source: unknown 10215 1727204033.80951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.81147: variable 'ansible_facts' from source: unknown 10215 1727204033.81260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.81355: attempt loop complete, returning result 10215 1727204033.81364: _execute() done 10215 
1727204033.81372: dumping result to json 10215 1727204033.81403: done dumping result, returning 10215 1727204033.81423: done running TaskExecutor() for managed-node3/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-3c74-8f8e-0000000000dd] 10215 1727204033.81513: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000dd ok: [managed-node3] 10215 1727204033.82064: no more pending results, returning what we have 10215 1727204033.82068: results queue empty 10215 1727204033.82069: checking for any_errors_fatal 10215 1727204033.82071: done checking for any_errors_fatal 10215 1727204033.82072: checking for max_fail_percentage 10215 1727204033.82074: done checking for max_fail_percentage 10215 1727204033.82075: checking to see if all hosts have failed and the running result is not ok 10215 1727204033.82076: done checking to see if all hosts have failed 10215 1727204033.82077: getting the remaining hosts for this loop 10215 1727204033.82078: done getting the remaining hosts for this loop 10215 1727204033.82083: getting the next task for host managed-node3 10215 1727204033.82129: done getting next task for host managed-node3 10215 1727204033.82134: ^ task is: TASK: Check if system is ostree 10215 1727204033.82137: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204033.82141: getting variables 10215 1727204033.82143: in VariableManager get_vars() 10215 1727204033.82170: Calling all_inventory to load vars for managed-node3 10215 1727204033.82174: Calling groups_inventory to load vars for managed-node3 10215 1727204033.82178: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204033.82194: Calling all_plugins_play to load vars for managed-node3 10215 1727204033.82199: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204033.82203: Calling groups_plugins_play to load vars for managed-node3 10215 1727204033.82777: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000dd 10215 1727204033.82784: WORKER PROCESS EXITING 10215 1727204033.83034: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204033.83601: done with get_vars() 10215 1727204033.83616: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:53:53 -0400 (0:00:00.700) 0:00:02.405 ***** 10215 1727204033.83859: entering _queue_task() for managed-node3/stat 10215 1727204033.84487: worker is 1 (out of 1 available) 10215 1727204033.84504: exiting _queue_task() for managed-node3/stat 10215 1727204033.84516: done queuing things up, now waiting for results queue to drain 10215 1727204033.84518: waiting for pending results... 
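The "ok: [managed-node3]" result above closes the minimal fact-gathering step, and the next banner queues the "Check if system is ostree" stat task from el_repo_setup.yml. As a rough sketch of the playbook tasks this portion of the log corresponds to (the task names, the gather_subset "min" argument, and the "not __network_is_ostree is defined" condition are taken from the log; the stat path and register variable are illustrative assumptions, not shown in this excerpt):

    - name: Gather the minimum subset of ansible_facts required by the network role test
      setup:
        gather_subset: min

    - name: Check if system is ostree
      stat:
        path: /run/ostree-booted   # assumed path; not shown in this log excerpt
      register: __ostree_booted_stat   # assumed variable name
      when: not __network_is_ostree is defined

The "[WARNING]: Module invocation had junk after the JSON data" message earlier is the Python verbose-mode import/cleanup trace: ansible_env in the gathered facts shows PYTHONVERBOSE=1, which makes the remote interpreter echo every import and shutdown step around the module's JSON output.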
10215 1727204033.84787: running TaskExecutor() for managed-node3/TASK: Check if system is ostree 10215 1727204033.85325: in run() - task 12b410aa-8751-3c74-8f8e-0000000000df 10215 1727204033.85329: variable 'ansible_search_path' from source: unknown 10215 1727204033.85332: variable 'ansible_search_path' from source: unknown 10215 1727204033.85335: calling self._execute() 10215 1727204033.85407: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.85448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.85529: variable 'omit' from source: magic vars 10215 1727204033.86096: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204033.86535: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204033.86732: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204033.86736: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204033.86780: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204033.86910: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204033.86952: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204033.87000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204033.87040: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204033.87205: Evaluated conditional (not __network_is_ostree is defined): True 10215 1727204033.87278: variable 'omit' from source: magic vars 10215 1727204033.87283: variable 'omit' from source: magic vars 10215 1727204033.87328: variable 'omit' from source: magic vars 10215 1727204033.87393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204033.87435: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204033.87497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204033.87501: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204033.87512: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204033.87552: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204033.87562: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.87572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.87723: Set connection var ansible_connection to ssh 10215 1727204033.87736: Set connection var ansible_pipelining to False 10215 1727204033.87795: Set connection 
var ansible_shell_type to sh 10215 1727204033.87798: Set connection var ansible_timeout to 10 10215 1727204033.87801: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204033.87803: Set connection var ansible_shell_executable to /bin/sh 10215 1727204033.87815: variable 'ansible_shell_executable' from source: unknown 10215 1727204033.87830: variable 'ansible_connection' from source: unknown 10215 1727204033.87844: variable 'ansible_module_compression' from source: unknown 10215 1727204033.87851: variable 'ansible_shell_type' from source: unknown 10215 1727204033.87859: variable 'ansible_shell_executable' from source: unknown 10215 1727204033.87867: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204033.87876: variable 'ansible_pipelining' from source: unknown 10215 1727204033.87884: variable 'ansible_timeout' from source: unknown 10215 1727204033.87935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204033.88099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204033.88119: variable 'omit' from source: magic vars 10215 1727204033.88130: starting attempt loop 10215 1727204033.88137: running the handler 10215 1727204033.88167: _low_level_execute_command(): starting 10215 1727204033.88180: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204033.89058: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.89210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.89216: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.89228: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.90938: stdout chunk (state=3): >>>/root <<< 10215 1727204033.91096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.91101: stdout chunk (state=3): >>><<< 10215 1727204033.91104: stderr chunk (state=3): >>><<< 10215 1727204033.91123: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.91136: _low_level_execute_command(): starting 10215 1727204033.91144: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087 `" && echo ansible-tmp-1727204033.9112303-10539-240453949698087="` echo /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087 `" ) && sleep 0' 10215 1727204033.91579: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.91583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.91586: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204033.91588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204033.91650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204033.91654: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204033.91719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204033.93702: stdout chunk (state=3): >>>ansible-tmp-1727204033.9112303-10539-240453949698087=/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087 <<< 10215 1727204033.93821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204033.93867: stderr chunk (state=3): >>><<< 10215 1727204033.93870: stdout chunk (state=3): >>><<< 10215 1727204033.93886: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204033.9112303-10539-240453949698087=/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204033.93931: variable 'ansible_module_compression' from source: unknown 10215 1727204033.93980: ANSIBALLZ: Using lock for stat 10215 1727204033.93983: ANSIBALLZ: Acquiring lock 10215 1727204033.93986: ANSIBALLZ: Lock acquired: 139878726275184 10215 1727204033.93991: ANSIBALLZ: Creating module 10215 1727204034.04343: ANSIBALLZ: Writing module into payload 10215 1727204034.04427: ANSIBALLZ: Writing module 10215 1727204034.04444: ANSIBALLZ: Renaming module 10215 1727204034.04451: ANSIBALLZ: Done creating module 10215 1727204034.04466: variable 'ansible_facts' from source: unknown 10215 1727204034.04518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py 10215 1727204034.04630: Sending initial data 10215 1727204034.04633: Sent initial data (153 bytes) 10215 1727204034.05112: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.05116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.05118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.05120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.05170: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.05173: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.05223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.06955: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: 
Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10215 1727204034.06963: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204034.06992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204034.07025: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpb9t707mu /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py <<< 10215 1727204034.07034: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py" <<< 10215 1727204034.07062: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpb9t707mu" to remote "/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py" <<< 10215 1727204034.09953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.10022: stderr chunk (state=3): >>><<< 10215 1727204034.10025: stdout chunk (state=3): >>><<< 10215 1727204034.10044: done transferring module to remote 10215 1727204034.10061: _low_level_execute_command(): starting 10215 1727204034.10064: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/ /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py && sleep 0' 10215 1727204034.10534: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204034.10538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.10540: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.10543: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.10595: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.10602: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.10642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 
1727204034.12511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.12557: stderr chunk (state=3): >>><<< 10215 1727204034.12560: stdout chunk (state=3): >>><<< 10215 1727204034.12577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204034.12580: _low_level_execute_command(): starting 10215 1727204034.12586: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/AnsiballZ_stat.py && sleep 0' 10215 1727204034.13032: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204034.13036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.13038: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.13040: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.13093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.13097: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.13144: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.15310: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 10215 1727204034.15343: stdout chunk (state=3): >>>import _imp # builtin <<< 10215 1727204034.15379: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 10215 1727204034.15386: stdout chunk (state=3): >>>import '_weakref' # <<< 10215 1727204034.15455: stdout chunk (state=3): 
>>>import '_io' # <<< 10215 1727204034.15461: stdout chunk (state=3): >>>import 'marshal' # <<< 10215 1727204034.15499: stdout chunk (state=3): >>>import 'posix' # <<< 10215 1727204034.15534: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 10215 1727204034.15572: stdout chunk (state=3): >>>import 'time' # <<< 10215 1727204034.15576: stdout chunk (state=3): >>>import 'zipimport' # <<< 10215 1727204034.15582: stdout chunk (state=3): >>># installed zipimport hook <<< 10215 1727204034.15637: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 10215 1727204034.15643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.15655: stdout chunk (state=3): >>>import '_codecs' # <<< 10215 1727204034.15683: stdout chunk (state=3): >>>import 'codecs' # <<< 10215 1727204034.15720: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 10215 1727204034.15750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 10215 1727204034.15762: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3032c4d0> <<< 10215 1727204034.15766: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb302fbad0> <<< 10215 1727204034.15797: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py <<< 10215 1727204034.15800: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 10215 1727204034.15811: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3032ea20> <<< 10215 1727204034.15832: stdout chunk (state=3): >>>import '_signal' # <<< 10215 1727204034.15858: stdout chunk (state=3): >>>import '_abc' # <<< 10215 1727204034.15864: stdout chunk (state=3): >>>import 'abc' # <<< 10215 1727204034.15883: stdout chunk (state=3): >>>import 'io' # <<< 10215 1727204034.15919: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 10215 1727204034.16015: stdout chunk (state=3): >>>import '_collections_abc' # <<< 10215 1727204034.16045: stdout chunk (state=3): >>>import 'genericpath' # <<< 10215 1727204034.16053: stdout chunk (state=3): >>>import 'posixpath' # <<< 10215 1727204034.16074: stdout chunk (state=3): >>>import 'os' # <<< 10215 1727204034.16097: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 10215 1727204034.16118: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 10215 1727204034.16138: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' <<< 10215 1727204034.16151: stdout chunk (state=3): >>>Adding directory: '/usr/lib/python3.12/site-packages' <<< 10215 1727204034.16163: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 10215 1727204034.16187: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 
10215 1727204034.16195: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 10215 1727204034.16214: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301410a0> <<< 10215 1727204034.16282: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 10215 1727204034.16297: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.16305: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30141fd0> <<< 10215 1727204034.16330: stdout chunk (state=3): >>>import 'site' # <<< 10215 1727204034.16365: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 10215 1727204034.16607: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 10215 1727204034.16626: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 10215 1727204034.16644: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 10215 1727204034.16662: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.16677: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 10215 1727204034.16721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 10215 1727204034.16741: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 10215 1727204034.16766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 10215 1727204034.16780: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017fec0> <<< 10215 1727204034.16810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 10215 1727204034.16823: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 10215 1727204034.16847: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017ff80> <<< 10215 1727204034.16877: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 10215 1727204034.16901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 10215 1727204034.16932: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 10215 1727204034.16977: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.16999: stdout chunk (state=3): 
>>>import 'itertools' # <<< 10215 1727204034.17024: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 10215 1727204034.17031: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301b78c0> <<< 10215 1727204034.17065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 10215 1727204034.17068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 10215 1727204034.17080: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301b7f50> <<< 10215 1727204034.17085: stdout chunk (state=3): >>>import '_collections' # <<< 10215 1727204034.17136: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30197b60> <<< 10215 1727204034.17150: stdout chunk (state=3): >>>import '_functools' # <<< 10215 1727204034.17179: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301952b0> <<< 10215 1727204034.17278: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017d070> <<< 10215 1727204034.17306: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 10215 1727204034.17325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 10215 1727204034.17343: stdout chunk (state=3): >>>import '_sre' # <<< 10215 1727204034.17366: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 10215 1727204034.17394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 10215 1727204034.17414: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 10215 1727204034.17421: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 10215 1727204034.17454: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301db890> <<< 10215 1727204034.17476: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301da4b0> <<< 10215 1727204034.17510: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 10215 1727204034.17514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301962a0> <<< 10215 1727204034.17521: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301d8bc0> <<< 10215 1727204034.17569: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 10215 1727204034.17588: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' 
import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020c800> <<< 10215 1727204034.17594: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017c2f0> <<< 10215 1727204034.17614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 10215 1727204034.17645: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.17651: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3020ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020cb60> <<< 10215 1727204034.17700: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.17706: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3020cf50> <<< 10215 1727204034.17712: stdout chunk (state=3): >>>import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017ae10> <<< 10215 1727204034.17744: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.17770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 10215 1727204034.17809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 10215 1727204034.17825: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020d2e0> <<< 10215 1727204034.17834: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 10215 1727204034.17862: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 10215 1727204034.17882: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020e510> <<< 10215 1727204034.17903: stdout chunk (state=3): >>>import 'importlib.util' # <<< 10215 1727204034.17910: stdout chunk (state=3): >>>import 'runpy' # <<< 10215 1727204034.17933: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 10215 1727204034.17970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 10215 1727204034.17997: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 10215 1727204034.18004: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 10215 1727204034.18011: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30228740> <<< 10215 1727204034.18027: stdout chunk (state=3): >>>import 'errno' # <<< 10215 1727204034.18059: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18067: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb30229e80> <<< 10215 1727204034.18087: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 10215 1727204034.18102: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 10215 1727204034.18119: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 10215 1727204034.18144: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3022ad80> <<< 10215 1727204034.18182: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18192: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3022b3e0> <<< 10215 1727204034.18207: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3022a2d0> <<< 10215 1727204034.18223: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 10215 1727204034.18236: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 10215 1727204034.18271: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18293: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3022be30> <<< 10215 1727204034.18298: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3022b560> <<< 10215 1727204034.18343: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020e570> <<< 10215 1727204034.18362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 10215 1727204034.18404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 10215 1727204034.18413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 10215 1727204034.18439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 10215 1727204034.18469: stdout 
chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18475: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fff7d40> <<< 10215 1727204034.18504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 10215 1727204034.18513: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 10215 1727204034.18531: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18537: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb300207d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30020530> <<< 10215 1727204034.18564: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb30020800> <<< 10215 1727204034.18603: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.18611: stdout chunk (state=3): >>>import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb300209e0> <<< 10215 1727204034.18625: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fff5ee0> <<< 10215 1727204034.18646: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 10215 1727204034.18752: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 10215 1727204034.18774: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 10215 1727204034.18795: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 10215 1727204034.18801: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30022000> <<< 10215 1727204034.18825: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30020c80> <<< 10215 1727204034.18849: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020ec60> <<< 10215 1727204034.18872: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 10215 1727204034.18930: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.18950: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 10215 1727204034.18997: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 10215 1727204034.19028: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3004e390> <<< 10215 1727204034.19076: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 10215 1727204034.19095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.19117: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 10215 1727204034.19138: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 10215 1727204034.19193: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30066540> <<< 10215 1727204034.19213: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 10215 1727204034.19255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 10215 1727204034.19315: stdout chunk (state=3): >>>import 'ntpath' # <<< 10215 1727204034.19342: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3009f2f0> <<< 10215 1727204034.19369: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 10215 1727204034.19405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 10215 1727204034.19433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 10215 1727204034.19473: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 10215 1727204034.19570: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb300c5a90> <<< 10215 1727204034.19645: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3009f410> <<< 10215 1727204034.19698: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb300671d0> <<< 10215 1727204034.19722: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 10215 1727204034.19730: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe9c440> <<< 10215 1727204034.19748: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30065580> import 'zipfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feb30022f30> <<< 10215 1727204034.19848: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 10215 1727204034.19872: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb2fe9c6e0> <<< 10215 1727204034.19953: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_rkkz57bo/ansible_stat_payload.zip' <<< 10215 1727204034.19958: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20113: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20150: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 10215 1727204034.20153: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 10215 1727204034.20206: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 10215 1727204034.20282: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 10215 1727204034.20316: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fef6120> <<< 10215 1727204034.20331: stdout chunk (state=3): >>>import '_typing' # <<< 10215 1727204034.20536: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecd0a0> <<< 10215 1727204034.20539: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecc200> <<< 10215 1727204034.20544: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20578: stdout chunk (state=3): >>>import 'ansible' # <<< 10215 1727204034.20583: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20610: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20624: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.20638: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 10215 1727204034.20647: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.22228: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.23508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 10215 1727204034.23516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecf1a0> <<< 10215 1727204034.23541: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.23577: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py <<< 10215 1727204034.23583: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 10215 1727204034.23609: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 10215 1727204034.23641: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff21b50> <<< 10215 1727204034.23688: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff218e0> <<< 10215 1727204034.23717: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff211f0> <<< 10215 1727204034.23744: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 10215 1727204034.23750: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 10215 1727204034.23791: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff21c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fef6bd0> <<< 10215 1727204034.23807: stdout chunk (state=3): >>>import 'atexit' # <<< 10215 1727204034.23837: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.23842: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff22900> <<< 10215 1727204034.23866: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.23871: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff22b40> <<< 10215 1727204034.23892: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 10215 1727204034.23942: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 10215 1727204034.23954: stdout chunk (state=3): >>>import '_locale' # <<< 10215 1727204034.24011: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff23020> <<< 10215 1727204034.24013: stdout chunk (state=3): >>>import 'pwd' # <<< 10215 1727204034.24034: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 10215 1727204034.24060: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 10215 1727204034.24102: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd84dd0> <<< 10215 1727204034.24130: stdout chunk (state=3): >>># extension module 'select' 
loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.24138: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fd869f0> <<< 10215 1727204034.24161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 10215 1727204034.24182: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 10215 1727204034.24188: stdout chunk (state=3): >>> <<< 10215 1727204034.24223: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd873b0> <<< 10215 1727204034.24244: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 10215 1727204034.24272: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 10215 1727204034.24293: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd88590> <<< 10215 1727204034.24314: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 10215 1727204034.24351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 10215 1727204034.24376: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 10215 1727204034.24381: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 10215 1727204034.24435: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8b080> <<< 10215 1727204034.24476: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fd8b1d0> <<< 10215 1727204034.24503: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd89340> <<< 10215 1727204034.24521: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 10215 1727204034.24549: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 10215 1727204034.24577: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py <<< 10215 1727204034.24587: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 10215 1727204034.24598: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 10215 1727204034.24629: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 10215 1727204034.24659: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 10215 1727204034.24663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 10215 1727204034.24676: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8ef60> <<< 10215 1727204034.24684: stdout chunk (state=3): >>>import '_tokenize' # <<< 10215 1727204034.24755: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8da60> <<< 10215 1727204034.24763: stdout chunk (state=3): >>>import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8d7c0> <<< 10215 1727204034.24784: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 10215 1727204034.24790: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 10215 1727204034.24867: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8fe60> <<< 10215 1727204034.24900: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd89850> <<< 10215 1727204034.24926: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.24932: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fdd70b0> <<< 10215 1727204034.24960: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc'<<< 10215 1727204034.24966: stdout chunk (state=3): >>> import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdd72c0> <<< 10215 1727204034.24986: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 10215 1727204034.25005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 10215 1727204034.25028: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 10215 1727204034.25068: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.25074: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fddcd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddcb30> <<< 10215 1727204034.25099: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 10215 1727204034.25208: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 10215 1727204034.25263: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fddf290> <<< 10215 1727204034.25271: stdout chunk (state=3): >>>import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddd400> <<< 10215 1727204034.25297: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 10215 1727204034.25342: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.25362: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 10215 1727204034.25384: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 10215 1727204034.25444: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde2ab0> <<< 10215 1727204034.25595: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddf440> <<< 10215 1727204034.25674: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.25679: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde38c0> <<< 10215 1727204034.25711: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde38f0> <<< 10215 1727204034.25762: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.25768: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde3ce0> <<< 10215 1727204034.25792: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdd74a0> <<< 10215 1727204034.25814: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py <<< 10215 1727204034.25822: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 10215 1727204034.25837: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 
10215 1727204034.25865: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 10215 1727204034.25892: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.25922: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde7380> <<< 10215 1727204034.26117: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.26132: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.26139: stdout chunk (state=3): >>>import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde8710> <<< 10215 1727204034.26146: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde5af0> <<< 10215 1727204034.26183: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde6ea0> <<< 10215 1727204034.26187: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde5700> <<< 10215 1727204034.26214: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26220: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 10215 1727204034.26247: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26349: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26450: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26471: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26484: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 10215 1727204034.26497: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26518: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 10215 1727204034.26529: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26673: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.26817: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.27486: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.28171: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 10215 1727204034.28188: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 10215 1727204034.28202: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # <<< 10215 1727204034.28218: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 10215 1727204034.28241: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.28295: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.28303: stdout chunk (state=3): >>># extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fe708f0> <<< 10215 1727204034.28407: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 10215 1727204034.28422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 10215 1727204034.28433: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe715e0> <<< 10215 1727204034.28448: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdeb170> <<< 10215 1727204034.28498: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 10215 1727204034.28514: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.28532: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.28557: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 10215 1727204034.28563: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.28740: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.28922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 10215 1727204034.28933: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 10215 1727204034.28948: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe71640> <<< 10215 1727204034.28957: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.29524: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30079: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30161: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30260: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 10215 1727204034.30263: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30307: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30349: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 10215 1727204034.30361: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30439: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30554: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 10215 1727204034.30577: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30589: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 10215 1727204034.30607: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30651: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.30698: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 10215 1727204034.30710: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 10215 1727204034.30987: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.31268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 10215 1727204034.31339: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 10215 1727204034.31350: stdout chunk (state=3): >>>import '_ast' # <<< 10215 1727204034.31444: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe72510> <<< 10215 1727204034.31451: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.31536: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.31620: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 10215 1727204034.31635: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 10215 1727204034.31645: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 10215 1727204034.31668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 10215 1727204034.31676: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 10215 1727204034.31756: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.31883: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7e150> <<< 10215 1727204034.31934: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.31946: stdout chunk (state=3): >>># extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7eae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe73050> <<< 10215 1727204034.31965: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32012: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32056: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 10215 1727204034.32065: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32116: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32160: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32224: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32296: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 10215 1727204034.32340: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.32427: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from 
'/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 10215 1727204034.32435: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7d850> <<< 10215 1727204034.32472: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc7ecf0> <<< 10215 1727204034.32505: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 10215 1727204034.32521: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32587: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32659: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32682: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.32729: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 10215 1727204034.32757: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 10215 1727204034.32776: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 10215 1727204034.32799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 10215 1727204034.32854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 10215 1727204034.32881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 10215 1727204034.32895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 10215 1727204034.32954: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd0eea0> <<< 10215 1727204034.33003: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc8bd10> <<< 10215 1727204034.33086: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc86cf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc86b40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 10215 1727204034.33103: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33129: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33159: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 10215 1727204034.33220: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 10215 1727204034.33232: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33249: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 10215 1727204034.33271: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33416: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33634: stdout chunk (state=3): >>># zipimport: zlib available <<< 10215 1727204034.33772: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 10215 1727204034.33778: stdout chunk (state=3): >>># destroy __main__ <<< 10215 1727204034.34099: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 10215 1727204034.34105: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins <<< 10215 1727204034.34142: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 10215 1727204034.34157: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg <<< 10215 1727204034.34173: stdout chunk (state=3): >>># cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma <<< 10215 1727204034.34194: stdout chunk (state=3): 
>>># cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible <<< 10215 1727204034.34229: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader <<< 10215 1727204034.34261: stdout chunk (state=3): >>># cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # 
destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 <<< 10215 1727204034.34271: stdout chunk (state=3): >>># cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 10215 1727204034.34504: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 10215 1727204034.34517: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 10215 1727204034.34530: stdout chunk (state=3): >>># destroy _bz2 <<< 10215 1727204034.34553: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 10215 1727204034.34567: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 10215 1727204034.34609: stdout chunk (state=3): >>># destroy ntpath <<< 10215 1727204034.34624: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal <<< 10215 1727204034.34643: stdout chunk (state=3): >>># destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 10215 1727204034.34659: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl <<< 10215 1727204034.34680: stdout chunk (state=3): >>># destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 10215 
1727204034.34687: stdout chunk (state=3): >>># destroy selectors # destroy errno <<< 10215 1727204034.34707: stdout chunk (state=3): >>># destroy array # destroy datetime <<< 10215 1727204034.34728: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 10215 1727204034.34740: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 10215 1727204034.34788: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux <<< 10215 1727204034.34798: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 10215 1727204034.34823: stdout chunk (state=3): >>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 10215 1727204034.34830: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 10215 1727204034.34852: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings <<< 10215 1727204034.34870: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler <<< 10215 1727204034.34890: stdout chunk (state=3): >>># destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools <<< 10215 1727204034.34904: stdout chunk (state=3): >>># cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 10215 1727204034.34930: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 10215 1727204034.34934: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread <<< 10215 1727204034.34949: stdout chunk (state=3): >>># cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 10215 1727204034.34955: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy 
systemd._reader # destroy systemd._journal # destroy _datetime <<< 10215 1727204034.35085: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 10215 1727204034.35103: stdout chunk (state=3): >>># destroy _collections <<< 10215 1727204034.35133: stdout chunk (state=3): >>># destroy platform <<< 10215 1727204034.35137: stdout chunk (state=3): >>># destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 10215 1727204034.35152: stdout chunk (state=3): >>># destroy tokenize <<< 10215 1727204034.35163: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 10215 1727204034.35187: stdout chunk (state=3): >>># destroy _typing <<< 10215 1727204034.35208: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator <<< 10215 1727204034.35215: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 10215 1727204034.35235: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules<<< 10215 1727204034.35241: stdout chunk (state=3): >>> # destroy _frozen_importlib <<< 10215 1727204034.35325: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 10215 1727204034.35333: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 10215 1727204034.35339: stdout chunk (state=3): >>># destroy time <<< 10215 1727204034.35361: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 10215 1727204034.35391: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 10215 1727204034.35417: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 10215 1727204034.35429: stdout chunk (state=3): >>># destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 10215 1727204034.35441: stdout chunk (state=3): >>># clear sys.audit hooks <<< 10215 1727204034.35795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
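The streamed chunks above carry the remote stat module's return value, {"changed": false, "stat": {"exists": false}, ...} for the path /run/ostree-booted, followed by the remote interpreter's module-cleanup trace and the SSH master reporting exit status 0 before the shared connection to 10.31.10.90 is closed; the missing marker file presumably means the managed node is not an OSTree-based system. As a minimal, hypothetical sketch (not part of the run logged here), the Python below shows how that JSON payload could be pulled out of one captured chunk and inspected; the chunk string is a stand-in copied from this log.

    import json
    import re

    # Hypothetical stand-in for one stdout chunk copied from this log;
    # in the real stream everything around the JSON is interpreter noise.
    chunk = ('{"changed": false, "stat": {"exists": false}, '
             '"invocation": {"module_args": {"path": "/run/ostree-booted", '
             '"follow": false, "get_checksum": true, "get_mime": true, '
             '"get_attributes": true, "checksum_algorithm": "sha1"}}}')

    # The module result is the first (and only) JSON object in the chunk.
    match = re.search(r'\{.*\}', chunk, re.DOTALL)
    result = json.loads(match.group(0)) if match else {}

    print(result["stat"]["exists"])                      # False
    print(result["invocation"]["module_args"]["path"])   # /run/ostree-booted
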
<<< 10215 1727204034.35861: stderr chunk (state=3): >>><<< 10215 1727204034.35865: stdout chunk (state=3): >>><<< 10215 1727204034.35935: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3032c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb302fbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3032ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301410a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30141fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301b78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301b7f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30197b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301952b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301db890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301da4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb301d8bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3020ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3020cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3017ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30228740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb30229e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7feb3022ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3022b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3022a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb3022be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3022b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fff7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb300207d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30020530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb30020800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb300209e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fff5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30022000> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30020c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3020ec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3004e390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30066540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3009f2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb300c5a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb3009f410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb300671d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe9c440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30065580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb30022f30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb2fe9c6e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_rkkz57bo/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fef6120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecd0a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecc200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fecf1a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff21b50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff218e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff211f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff21c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fef6bd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff22900> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2ff22b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2ff23020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd84dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fd869f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd873b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd88590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fd8b1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd89340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8ef60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd8fe60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd89850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fdd70b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdd72c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fddcd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddcb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fddf290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddd400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde2ab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fddf440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde38c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde38f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde3ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdd74a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde7380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde8710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde5af0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fde6ea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fde5700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fe708f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe715e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fdeb170> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe71640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe72510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7e150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7eae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fe73050> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb2fc7d850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc7ecf0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fd0eea0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc8bd10> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc86cf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb2fc86b40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
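The module output above belongs to the "Check if system is ostree" task: the invocation record shows a stat call against /run/ostree-booted, and the file does not exist on this node. A minimal sketch of what such a task could look like in el_repo_setup.yml, assuming the result is registered as __ostree_booted_stat (that variable name appears in the trace further below; the real task file is not reproduced in this log, and the extra stat options in the invocation, get_checksum, get_mime, get_attributes, are module defaults):

- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat   # assumed register name, taken from the later variable trace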
[WARNING]: Module invocation had junk after the JSON data: (interpreter shutdown messages, duplicated from the module output above)
10215 1727204034.36484: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204034.36487: _low_level_execute_command(): starting 10215 1727204034.36492:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204033.9112303-10539-240453949698087/ > /dev/null 2>&1 && sleep 0' 10215 1727204034.36669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.36672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204034.36675: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204034.36678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204034.36680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.36735: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.36743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.36745: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.36780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.38661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.38710: stderr chunk (state=3): >>><<< 10215 1727204034.38714: stdout chunk (state=3): >>><<< 10215 1727204034.38732: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204034.38738: handler run complete 10215 1727204034.38759: attempt loop complete, returning result 10215 1727204034.38762: _execute() done 10215 1727204034.38765: dumping result to json 10215 1727204034.38770: done dumping result, returning 10215 1727204034.38778: done running TaskExecutor() for managed-node3/TASK: Check if system is ostree 
[12b410aa-8751-3c74-8f8e-0000000000df] 10215 1727204034.38784: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000df 10215 1727204034.38880: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000df 10215 1727204034.38883: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10215 1727204034.38954: no more pending results, returning what we have 10215 1727204034.38958: results queue empty 10215 1727204034.38959: checking for any_errors_fatal 10215 1727204034.38968: done checking for any_errors_fatal 10215 1727204034.38969: checking for max_fail_percentage 10215 1727204034.38971: done checking for max_fail_percentage 10215 1727204034.38971: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.38972: done checking to see if all hosts have failed 10215 1727204034.38973: getting the remaining hosts for this loop 10215 1727204034.38975: done getting the remaining hosts for this loop 10215 1727204034.38979: getting the next task for host managed-node3 10215 1727204034.38987: done getting next task for host managed-node3 10215 1727204034.38991: ^ task is: TASK: Set flag to indicate system is ostree 10215 1727204034.38994: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.38997: getting variables 10215 1727204034.38999: in VariableManager get_vars() 10215 1727204034.39031: Calling all_inventory to load vars for managed-node3 10215 1727204034.39034: Calling groups_inventory to load vars for managed-node3 10215 1727204034.39037: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.39049: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.39052: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.39055: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.39261: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.39446: done with get_vars() 10215 1727204034.39455: done getting variables 10215 1727204034.39535: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.556) 0:00:02.962 ***** 10215 1727204034.39559: entering _queue_task() for managed-node3/set_fact 10215 1727204034.39560: Creating lock for set_fact 10215 1727204034.39770: worker is 1 (out of 1 available) 10215 1727204034.39784: exiting _queue_task() for managed-node3/set_fact 10215 1727204034.39798: done queuing things up, now waiting for results queue to drain 10215 1727204034.39800: waiting for pending results... 
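The next task turns that stat result into a fact: the trace below shows the conditional "not __network_is_ostree is defined" evaluating to True and the task returning ansible_facts with __network_is_ostree: false. A hedged sketch of a set_fact task consistent with that behaviour (the exact expression used in the role's task file is an assumption, not quoted from this log):

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # false here, since /run/ostree-booted is absent
  when: not __network_is_ostree is defined   # matches the conditional evaluated in the trace below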
10215 1727204034.39953: running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree 10215 1727204034.40024: in run() - task 12b410aa-8751-3c74-8f8e-0000000000e0 10215 1727204034.40036: variable 'ansible_search_path' from source: unknown 10215 1727204034.40039: variable 'ansible_search_path' from source: unknown 10215 1727204034.40070: calling self._execute() 10215 1727204034.40138: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.40142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.40155: variable 'omit' from source: magic vars 10215 1727204034.40542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204034.40738: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204034.40775: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204034.40809: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204034.40840: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204034.40935: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204034.40956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204034.40978: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204034.41001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204034.41102: Evaluated conditional (not __network_is_ostree is defined): True 10215 1727204034.41107: variable 'omit' from source: magic vars 10215 1727204034.41144: variable 'omit' from source: magic vars 10215 1727204034.41245: variable '__ostree_booted_stat' from source: set_fact 10215 1727204034.41281: variable 'omit' from source: magic vars 10215 1727204034.41303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204034.41330: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204034.41347: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204034.41364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.41374: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.41402: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204034.41405: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.41413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.41492: Set connection var ansible_connection to ssh 10215 
1727204034.41498: Set connection var ansible_pipelining to False 10215 1727204034.41505: Set connection var ansible_shell_type to sh 10215 1727204034.41514: Set connection var ansible_timeout to 10 10215 1727204034.41521: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204034.41529: Set connection var ansible_shell_executable to /bin/sh 10215 1727204034.41547: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.41551: variable 'ansible_connection' from source: unknown 10215 1727204034.41553: variable 'ansible_module_compression' from source: unknown 10215 1727204034.41558: variable 'ansible_shell_type' from source: unknown 10215 1727204034.41560: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.41566: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.41574: variable 'ansible_pipelining' from source: unknown 10215 1727204034.41577: variable 'ansible_timeout' from source: unknown 10215 1727204034.41579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.41660: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204034.41669: variable 'omit' from source: magic vars 10215 1727204034.41675: starting attempt loop 10215 1727204034.41680: running the handler 10215 1727204034.41692: handler run complete 10215 1727204034.41702: attempt loop complete, returning result 10215 1727204034.41705: _execute() done 10215 1727204034.41708: dumping result to json 10215 1727204034.41715: done dumping result, returning 10215 1727204034.41722: done running TaskExecutor() for managed-node3/TASK: Set flag to indicate system is ostree [12b410aa-8751-3c74-8f8e-0000000000e0] 10215 1727204034.41728: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e0 10215 1727204034.41807: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e0 10215 1727204034.41811: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 10215 1727204034.41866: no more pending results, returning what we have 10215 1727204034.41869: results queue empty 10215 1727204034.41870: checking for any_errors_fatal 10215 1727204034.41878: done checking for any_errors_fatal 10215 1727204034.41879: checking for max_fail_percentage 10215 1727204034.41880: done checking for max_fail_percentage 10215 1727204034.41881: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.41882: done checking to see if all hosts have failed 10215 1727204034.41883: getting the remaining hosts for this loop 10215 1727204034.41884: done getting the remaining hosts for this loop 10215 1727204034.41888: getting the next task for host managed-node3 10215 1727204034.41898: done getting next task for host managed-node3 10215 1727204034.41900: ^ task is: TASK: Fix CentOS6 Base repo 10215 1727204034.41903: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204034.41906: getting variables 10215 1727204034.41908: in VariableManager get_vars() 10215 1727204034.41932: Calling all_inventory to load vars for managed-node3 10215 1727204034.41936: Calling groups_inventory to load vars for managed-node3 10215 1727204034.41940: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.41950: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.41953: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.41962: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.42094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.42243: done with get_vars() 10215 1727204034.42251: done getting variables 10215 1727204034.42343: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.028) 0:00:02.990 ***** 10215 1727204034.42364: entering _queue_task() for managed-node3/copy 10215 1727204034.42549: worker is 1 (out of 1 available) 10215 1727204034.42564: exiting _queue_task() for managed-node3/copy 10215 1727204034.42575: done queuing things up, now waiting for results queue to drain 10215 1727204034.42577: waiting for pending results... 
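The "Fix CentOS6 Base repo" task is loaded as a copy action and, in the trace that follows, is skipped because ansible_distribution == 'CentOS' is False on this managed node. Only the copy module and that condition are taken from this log; the destination path, repository contents, and the additional version guard in the sketch below are illustrative assumptions about the general shape of such a task:

- name: Fix CentOS6 Base repo
  ansible.builtin.copy:
    dest: /etc/yum.repos.d/CentOS-Base.repo   # assumed target path; not shown in this log
    content: |
      [base]
      name=CentOS-$releasever - Base
      baseurl=https://vault.centos.org/6.10/os/$basearch/   # assumed vault URL for EOL CentOS 6
      gpgcheck=0
  when:
    - ansible_distribution == 'CentOS'                  # the condition evaluated (False) in the trace below
    - ansible_distribution_major_version == '6'         # assumed additional guard, not evidenced in this log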
10215 1727204034.42718: running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo 10215 1727204034.42791: in run() - task 12b410aa-8751-3c74-8f8e-0000000000e2 10215 1727204034.42804: variable 'ansible_search_path' from source: unknown 10215 1727204034.42807: variable 'ansible_search_path' from source: unknown 10215 1727204034.42839: calling self._execute() 10215 1727204034.42893: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.42901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.42910: variable 'omit' from source: magic vars 10215 1727204034.43320: variable 'ansible_distribution' from source: facts 10215 1727204034.43338: Evaluated conditional (ansible_distribution == 'CentOS'): False 10215 1727204034.43341: when evaluation is False, skipping this task 10215 1727204034.43344: _execute() done 10215 1727204034.43346: dumping result to json 10215 1727204034.43353: done dumping result, returning 10215 1727204034.43361: done running TaskExecutor() for managed-node3/TASK: Fix CentOS6 Base repo [12b410aa-8751-3c74-8f8e-0000000000e2] 10215 1727204034.43364: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e2 10215 1727204034.43456: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e2 10215 1727204034.43459: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 10215 1727204034.43528: no more pending results, returning what we have 10215 1727204034.43531: results queue empty 10215 1727204034.43532: checking for any_errors_fatal 10215 1727204034.43535: done checking for any_errors_fatal 10215 1727204034.43536: checking for max_fail_percentage 10215 1727204034.43537: done checking for max_fail_percentage 10215 1727204034.43538: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.43540: done checking to see if all hosts have failed 10215 1727204034.43541: getting the remaining hosts for this loop 10215 1727204034.43542: done getting the remaining hosts for this loop 10215 1727204034.43546: getting the next task for host managed-node3 10215 1727204034.43551: done getting next task for host managed-node3 10215 1727204034.43554: ^ task is: TASK: Include the task 'enable_epel.yml' 10215 1727204034.43556: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.43560: getting variables 10215 1727204034.43561: in VariableManager get_vars() 10215 1727204034.43584: Calling all_inventory to load vars for managed-node3 10215 1727204034.43586: Calling groups_inventory to load vars for managed-node3 10215 1727204034.43590: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.43598: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.43600: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.43602: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.43756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.43907: done with get_vars() 10215 1727204034.43915: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.016) 0:00:03.006 ***** 10215 1727204034.43980: entering _queue_task() for managed-node3/include_tasks 10215 1727204034.44160: worker is 1 (out of 1 available) 10215 1727204034.44174: exiting _queue_task() for managed-node3/include_tasks 10215 1727204034.44184: done queuing things up, now waiting for results queue to drain 10215 1727204034.44186: waiting for pending results... 10215 1727204034.44327: running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' 10215 1727204034.44398: in run() - task 12b410aa-8751-3c74-8f8e-0000000000e3 10215 1727204034.44411: variable 'ansible_search_path' from source: unknown 10215 1727204034.44415: variable 'ansible_search_path' from source: unknown 10215 1727204034.44446: calling self._execute() 10215 1727204034.44499: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.44506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.44518: variable 'omit' from source: magic vars 10215 1727204034.44892: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204034.46555: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204034.46609: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204034.46640: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204034.46678: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204034.46702: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204034.46773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204034.46797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204034.46827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, 
class_only=False) 10215 1727204034.46857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204034.46870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204034.46967: variable '__network_is_ostree' from source: set_fact 10215 1727204034.46981: Evaluated conditional (not __network_is_ostree | d(false)): True 10215 1727204034.46987: _execute() done 10215 1727204034.46992: dumping result to json 10215 1727204034.46997: done dumping result, returning 10215 1727204034.47003: done running TaskExecutor() for managed-node3/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-3c74-8f8e-0000000000e3] 10215 1727204034.47011: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e3 10215 1727204034.47128: no more pending results, returning what we have 10215 1727204034.47134: in VariableManager get_vars() 10215 1727204034.47168: Calling all_inventory to load vars for managed-node3 10215 1727204034.47171: Calling groups_inventory to load vars for managed-node3 10215 1727204034.47175: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.47186: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.47192: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.47196: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.47365: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000e3 10215 1727204034.47369: WORKER PROCESS EXITING 10215 1727204034.47381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.47534: done with get_vars() 10215 1727204034.47541: variable 'ansible_search_path' from source: unknown 10215 1727204034.47542: variable 'ansible_search_path' from source: unknown 10215 1727204034.47571: we have included files to process 10215 1727204034.47572: generating all_blocks data 10215 1727204034.47573: done generating all_blocks data 10215 1727204034.47577: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10215 1727204034.47578: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10215 1727204034.47580: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 10215 1727204034.48130: done processing included file 10215 1727204034.48132: iterating over new_blocks loaded from include file 10215 1727204034.48133: in VariableManager get_vars() 10215 1727204034.48142: done with get_vars() 10215 1727204034.48143: filtering new block on tags 10215 1727204034.48161: done filtering new block on tags 10215 1727204034.48164: in VariableManager get_vars() 10215 1727204034.48173: done with get_vars() 10215 1727204034.48174: filtering new block on tags 10215 1727204034.48183: done filtering new block on tags 10215 1727204034.48184: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node3 10215 1727204034.48191: extending task lists for all hosts with included blocks 
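The include processed above is gated on the __network_is_ostree fact that the earlier set_fact task reported as false. A sketch of that pairing, using only the fact name, the evaluated conditional, and the included file name shown in the log (how the flag is actually computed is not visible here):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: false   # the log shows false for this host; how the real task derives it is not shown

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)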
10215 1727204034.48269: done extending task lists 10215 1727204034.48270: done processing included files 10215 1727204034.48271: results queue empty 10215 1727204034.48272: checking for any_errors_fatal 10215 1727204034.48275: done checking for any_errors_fatal 10215 1727204034.48276: checking for max_fail_percentage 10215 1727204034.48277: done checking for max_fail_percentage 10215 1727204034.48278: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.48278: done checking to see if all hosts have failed 10215 1727204034.48279: getting the remaining hosts for this loop 10215 1727204034.48280: done getting the remaining hosts for this loop 10215 1727204034.48282: getting the next task for host managed-node3 10215 1727204034.48284: done getting next task for host managed-node3 10215 1727204034.48286: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 10215 1727204034.48288: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.48291: getting variables 10215 1727204034.48292: in VariableManager get_vars() 10215 1727204034.48298: Calling all_inventory to load vars for managed-node3 10215 1727204034.48300: Calling groups_inventory to load vars for managed-node3 10215 1727204034.48301: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.48305: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.48311: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.48314: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.48433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.48578: done with get_vars() 10215 1727204034.48585: done getting variables 10215 1727204034.48640: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10215 1727204034.48798: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.048) 0:00:03.055 ***** 10215 1727204034.48836: entering _queue_task() for managed-node3/command 10215 1727204034.48837: Creating lock for command 10215 1727204034.49039: worker is 1 (out of 1 available) 10215 1727204034.49052: exiting _queue_task() for managed-node3/command 10215 1727204034.49064: done queuing things up, now waiting for results queue to drain 10215 1727204034.49066: waiting for pending results... 
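The task queued above has a templated name ('Create EPEL {{ ansible_distribution_major_version }}', rendered here as 'Create EPEL 39') and runs a command only on RedHat or CentOS, which is why the next block skips it. A sketch with a hypothetical command line, since the body of enable_epel.yml:8 is not quoted in the log:

    - name: Create EPEL {{ ansible_distribution_major_version }}
      # hypothetical command; the real arguments are not visible in this log
      command: dnf install -y epel-release
      when: ansible_distribution in ['RedHat', 'CentOS']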
10215 1727204034.49228: running TaskExecutor() for managed-node3/TASK: Create EPEL 39 10215 1727204034.49311: in run() - task 12b410aa-8751-3c74-8f8e-0000000000fd 10215 1727204034.49319: variable 'ansible_search_path' from source: unknown 10215 1727204034.49323: variable 'ansible_search_path' from source: unknown 10215 1727204034.49353: calling self._execute() 10215 1727204034.49410: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.49422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.49432: variable 'omit' from source: magic vars 10215 1727204034.49809: variable 'ansible_distribution' from source: facts 10215 1727204034.49814: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10215 1727204034.49818: when evaluation is False, skipping this task 10215 1727204034.49820: _execute() done 10215 1727204034.49823: dumping result to json 10215 1727204034.49826: done dumping result, returning 10215 1727204034.49828: done running TaskExecutor() for managed-node3/TASK: Create EPEL 39 [12b410aa-8751-3c74-8f8e-0000000000fd] 10215 1727204034.49831: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000fd 10215 1727204034.49908: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000fd 10215 1727204034.49911: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10215 1727204034.50051: no more pending results, returning what we have 10215 1727204034.50054: results queue empty 10215 1727204034.50055: checking for any_errors_fatal 10215 1727204034.50057: done checking for any_errors_fatal 10215 1727204034.50058: checking for max_fail_percentage 10215 1727204034.50059: done checking for max_fail_percentage 10215 1727204034.50060: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.50061: done checking to see if all hosts have failed 10215 1727204034.50062: getting the remaining hosts for this loop 10215 1727204034.50064: done getting the remaining hosts for this loop 10215 1727204034.50068: getting the next task for host managed-node3 10215 1727204034.50074: done getting next task for host managed-node3 10215 1727204034.50076: ^ task is: TASK: Install yum-utils package 10215 1727204034.50080: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.50084: getting variables 10215 1727204034.50086: in VariableManager get_vars() 10215 1727204034.50118: Calling all_inventory to load vars for managed-node3 10215 1727204034.50121: Calling groups_inventory to load vars for managed-node3 10215 1727204034.50125: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.50134: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.50138: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.50141: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.50359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.50667: done with get_vars() 10215 1727204034.50678: done getting variables 10215 1727204034.50792: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.019) 0:00:03.075 ***** 10215 1727204034.50828: entering _queue_task() for managed-node3/package 10215 1727204034.50830: Creating lock for package 10215 1727204034.51078: worker is 1 (out of 1 available) 10215 1727204034.51093: exiting _queue_task() for managed-node3/package 10215 1727204034.51105: done queuing things up, now waiting for results queue to drain 10215 1727204034.51110: waiting for pending results... 
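The package task queued above (enable_epel.yml:26) is guarded by the same distribution check and is likewise skipped below. A sketch assuming the obvious module arguments; only the task name and the package action appear in the log:

    - name: Install yum-utils package
      package:
        name: yum-utils      # inferred from the task name
        state: present       # assumption; not shown in the log
      when: ansible_distribution in ['RedHat', 'CentOS']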
10215 1727204034.51710: running TaskExecutor() for managed-node3/TASK: Install yum-utils package 10215 1727204034.51717: in run() - task 12b410aa-8751-3c74-8f8e-0000000000fe 10215 1727204034.51721: variable 'ansible_search_path' from source: unknown 10215 1727204034.51725: variable 'ansible_search_path' from source: unknown 10215 1727204034.51728: calling self._execute() 10215 1727204034.51732: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.51735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.51738: variable 'omit' from source: magic vars 10215 1727204034.52106: variable 'ansible_distribution' from source: facts 10215 1727204034.52124: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10215 1727204034.52128: when evaluation is False, skipping this task 10215 1727204034.52131: _execute() done 10215 1727204034.52183: dumping result to json 10215 1727204034.52187: done dumping result, returning 10215 1727204034.52193: done running TaskExecutor() for managed-node3/TASK: Install yum-utils package [12b410aa-8751-3c74-8f8e-0000000000fe] 10215 1727204034.52195: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000fe 10215 1727204034.52263: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000fe 10215 1727204034.52266: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10215 1727204034.52345: no more pending results, returning what we have 10215 1727204034.52349: results queue empty 10215 1727204034.52350: checking for any_errors_fatal 10215 1727204034.52357: done checking for any_errors_fatal 10215 1727204034.52358: checking for max_fail_percentage 10215 1727204034.52360: done checking for max_fail_percentage 10215 1727204034.52361: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.52362: done checking to see if all hosts have failed 10215 1727204034.52363: getting the remaining hosts for this loop 10215 1727204034.52365: done getting the remaining hosts for this loop 10215 1727204034.52370: getting the next task for host managed-node3 10215 1727204034.52377: done getting next task for host managed-node3 10215 1727204034.52380: ^ task is: TASK: Enable EPEL 7 10215 1727204034.52384: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.52387: getting variables 10215 1727204034.52391: in VariableManager get_vars() 10215 1727204034.52426: Calling all_inventory to load vars for managed-node3 10215 1727204034.52430: Calling groups_inventory to load vars for managed-node3 10215 1727204034.52434: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.52448: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.52453: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.52457: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.52780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.53064: done with get_vars() 10215 1727204034.53076: done getting variables 10215 1727204034.53145: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.023) 0:00:03.098 ***** 10215 1727204034.53179: entering _queue_task() for managed-node3/command 10215 1727204034.53446: worker is 1 (out of 1 available) 10215 1727204034.53462: exiting _queue_task() for managed-node3/command 10215 1727204034.53474: done queuing things up, now waiting for results queue to drain 10215 1727204034.53476: waiting for pending results... 10215 1727204034.53911: running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 10215 1727204034.54098: in run() - task 12b410aa-8751-3c74-8f8e-0000000000ff 10215 1727204034.54102: variable 'ansible_search_path' from source: unknown 10215 1727204034.54107: variable 'ansible_search_path' from source: unknown 10215 1727204034.54111: calling self._execute() 10215 1727204034.54124: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.54140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.54157: variable 'omit' from source: magic vars 10215 1727204034.54620: variable 'ansible_distribution' from source: facts 10215 1727204034.54643: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10215 1727204034.54653: when evaluation is False, skipping this task 10215 1727204034.54668: _execute() done 10215 1727204034.54677: dumping result to json 10215 1727204034.54686: done dumping result, returning 10215 1727204034.54701: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 7 [12b410aa-8751-3c74-8f8e-0000000000ff] 10215 1727204034.54713: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000ff skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10215 1727204034.54885: no more pending results, returning what we have 10215 1727204034.54888: results queue empty 10215 1727204034.54994: checking for any_errors_fatal 10215 1727204034.55001: done checking for any_errors_fatal 10215 1727204034.55002: checking for max_fail_percentage 10215 1727204034.55003: done checking for max_fail_percentage 10215 1727204034.55005: checking to see if all hosts have 
failed and the running result is not ok 10215 1727204034.55006: done checking to see if all hosts have failed 10215 1727204034.55007: getting the remaining hosts for this loop 10215 1727204034.55010: done getting the remaining hosts for this loop 10215 1727204034.55014: getting the next task for host managed-node3 10215 1727204034.55020: done getting next task for host managed-node3 10215 1727204034.55023: ^ task is: TASK: Enable EPEL 8 10215 1727204034.55027: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204034.55030: getting variables 10215 1727204034.55032: in VariableManager get_vars() 10215 1727204034.55058: Calling all_inventory to load vars for managed-node3 10215 1727204034.55061: Calling groups_inventory to load vars for managed-node3 10215 1727204034.55065: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.55077: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.55080: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.55085: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.55352: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000ff 10215 1727204034.55356: WORKER PROCESS EXITING 10215 1727204034.55373: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.55668: done with get_vars() 10215 1727204034.55679: done getting variables 10215 1727204034.55753: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.026) 0:00:03.125 ***** 10215 1727204034.55791: entering _queue_task() for managed-node3/command 10215 1727204034.56053: worker is 1 (out of 1 available) 10215 1727204034.56066: exiting _queue_task() for managed-node3/command 10215 1727204034.56194: done queuing things up, now waiting for results queue to drain 10215 1727204034.56197: waiting for pending results... 
10215 1727204034.56422: running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 10215 1727204034.56506: in run() - task 12b410aa-8751-3c74-8f8e-000000000100 10215 1727204034.56695: variable 'ansible_search_path' from source: unknown 10215 1727204034.56699: variable 'ansible_search_path' from source: unknown 10215 1727204034.56702: calling self._execute() 10215 1727204034.56705: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.56707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.56709: variable 'omit' from source: magic vars 10215 1727204034.57161: variable 'ansible_distribution' from source: facts 10215 1727204034.57181: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10215 1727204034.57192: when evaluation is False, skipping this task 10215 1727204034.57202: _execute() done 10215 1727204034.57211: dumping result to json 10215 1727204034.57220: done dumping result, returning 10215 1727204034.57231: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 8 [12b410aa-8751-3c74-8f8e-000000000100] 10215 1727204034.57242: sending task result for task 12b410aa-8751-3c74-8f8e-000000000100 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10215 1727204034.57421: no more pending results, returning what we have 10215 1727204034.57426: results queue empty 10215 1727204034.57427: checking for any_errors_fatal 10215 1727204034.57432: done checking for any_errors_fatal 10215 1727204034.57433: checking for max_fail_percentage 10215 1727204034.57435: done checking for max_fail_percentage 10215 1727204034.57437: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.57438: done checking to see if all hosts have failed 10215 1727204034.57439: getting the remaining hosts for this loop 10215 1727204034.57440: done getting the remaining hosts for this loop 10215 1727204034.57446: getting the next task for host managed-node3 10215 1727204034.57456: done getting next task for host managed-node3 10215 1727204034.57458: ^ task is: TASK: Enable EPEL 6 10215 1727204034.57463: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.57468: getting variables 10215 1727204034.57470: in VariableManager get_vars() 10215 1727204034.57504: Calling all_inventory to load vars for managed-node3 10215 1727204034.57508: Calling groups_inventory to load vars for managed-node3 10215 1727204034.57513: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.57528: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.57533: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.57538: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.57955: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000100 10215 1727204034.57958: WORKER PROCESS EXITING 10215 1727204034.57985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.58285: done with get_vars() 10215 1727204034.58297: done getting variables 10215 1727204034.58368: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.026) 0:00:03.151 ***** 10215 1727204034.58405: entering _queue_task() for managed-node3/copy 10215 1727204034.58648: worker is 1 (out of 1 available) 10215 1727204034.58660: exiting _queue_task() for managed-node3/copy 10215 1727204034.58671: done queuing things up, now waiting for results queue to drain 10215 1727204034.58673: waiting for pending results... 
10215 1727204034.58934: running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 10215 1727204034.59074: in run() - task 12b410aa-8751-3c74-8f8e-000000000102 10215 1727204034.59099: variable 'ansible_search_path' from source: unknown 10215 1727204034.59107: variable 'ansible_search_path' from source: unknown 10215 1727204034.59157: calling self._execute() 10215 1727204034.59247: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.59263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.59281: variable 'omit' from source: magic vars 10215 1727204034.59739: variable 'ansible_distribution' from source: facts 10215 1727204034.59759: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 10215 1727204034.59778: when evaluation is False, skipping this task 10215 1727204034.59790: _execute() done 10215 1727204034.59800: dumping result to json 10215 1727204034.59808: done dumping result, returning 10215 1727204034.59819: done running TaskExecutor() for managed-node3/TASK: Enable EPEL 6 [12b410aa-8751-3c74-8f8e-000000000102] 10215 1727204034.59830: sending task result for task 12b410aa-8751-3c74-8f8e-000000000102 10215 1727204034.59954: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000102 10215 1727204034.59958: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 10215 1727204034.60046: no more pending results, returning what we have 10215 1727204034.60051: results queue empty 10215 1727204034.60052: checking for any_errors_fatal 10215 1727204034.60057: done checking for any_errors_fatal 10215 1727204034.60058: checking for max_fail_percentage 10215 1727204034.60060: done checking for max_fail_percentage 10215 1727204034.60061: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.60062: done checking to see if all hosts have failed 10215 1727204034.60063: getting the remaining hosts for this loop 10215 1727204034.60065: done getting the remaining hosts for this loop 10215 1727204034.60070: getting the next task for host managed-node3 10215 1727204034.60079: done getting next task for host managed-node3 10215 1727204034.60082: ^ task is: TASK: Set network provider to 'nm' 10215 1727204034.60085: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.60091: getting variables 10215 1727204034.60093: in VariableManager get_vars() 10215 1727204034.60237: Calling all_inventory to load vars for managed-node3 10215 1727204034.60240: Calling groups_inventory to load vars for managed-node3 10215 1727204034.60244: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.60255: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.60259: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.60263: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.60585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.60874: done with get_vars() 10215 1727204034.60884: done getting variables 10215 1727204034.60951: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.025) 0:00:03.177 ***** 10215 1727204034.60988: entering _queue_task() for managed-node3/set_fact 10215 1727204034.61332: worker is 1 (out of 1 available) 10215 1727204034.61343: exiting _queue_task() for managed-node3/set_fact 10215 1727204034.61354: done queuing things up, now waiting for results queue to drain 10215 1727204034.61356: waiting for pending results... 10215 1727204034.61708: running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' 10215 1727204034.61714: in run() - task 12b410aa-8751-3c74-8f8e-000000000007 10215 1727204034.61717: variable 'ansible_search_path' from source: unknown 10215 1727204034.61720: calling self._execute() 10215 1727204034.61778: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.61798: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.61823: variable 'omit' from source: magic vars 10215 1727204034.61960: variable 'omit' from source: magic vars 10215 1727204034.62008: variable 'omit' from source: magic vars 10215 1727204034.62070: variable 'omit' from source: magic vars 10215 1727204034.62131: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204034.62181: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204034.62213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204034.62345: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.62350: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.62353: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204034.62356: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.62358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.62463: Set connection var ansible_connection to ssh 
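The set_fact task now running comes from tests_bond_nm.yml:13, and its result below shows exactly which fact it sets. A sketch consistent with that result:

    - name: Set network provider to 'nm'
      set_fact:
        network_provider: nm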
10215 1727204034.62482: Set connection var ansible_pipelining to False 10215 1727204034.62498: Set connection var ansible_shell_type to sh 10215 1727204034.62512: Set connection var ansible_timeout to 10 10215 1727204034.62523: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204034.62538: Set connection var ansible_shell_executable to /bin/sh 10215 1727204034.62581: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.62584: variable 'ansible_connection' from source: unknown 10215 1727204034.62672: variable 'ansible_module_compression' from source: unknown 10215 1727204034.62676: variable 'ansible_shell_type' from source: unknown 10215 1727204034.62678: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.62681: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.62686: variable 'ansible_pipelining' from source: unknown 10215 1727204034.62688: variable 'ansible_timeout' from source: unknown 10215 1727204034.62691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.62891: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204034.62898: variable 'omit' from source: magic vars 10215 1727204034.62901: starting attempt loop 10215 1727204034.62904: running the handler 10215 1727204034.62906: handler run complete 10215 1727204034.62911: attempt loop complete, returning result 10215 1727204034.62916: _execute() done 10215 1727204034.62919: dumping result to json 10215 1727204034.62921: done dumping result, returning 10215 1727204034.62928: done running TaskExecutor() for managed-node3/TASK: Set network provider to 'nm' [12b410aa-8751-3c74-8f8e-000000000007] 10215 1727204034.62940: sending task result for task 12b410aa-8751-3c74-8f8e-000000000007 ok: [managed-node3] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 10215 1727204034.63178: no more pending results, returning what we have 10215 1727204034.63182: results queue empty 10215 1727204034.63183: checking for any_errors_fatal 10215 1727204034.63188: done checking for any_errors_fatal 10215 1727204034.63191: checking for max_fail_percentage 10215 1727204034.63193: done checking for max_fail_percentage 10215 1727204034.63195: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.63196: done checking to see if all hosts have failed 10215 1727204034.63197: getting the remaining hosts for this loop 10215 1727204034.63198: done getting the remaining hosts for this loop 10215 1727204034.63203: getting the next task for host managed-node3 10215 1727204034.63211: done getting next task for host managed-node3 10215 1727204034.63213: ^ task is: TASK: meta (flush_handlers) 10215 1727204034.63215: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.63220: getting variables 10215 1727204034.63222: in VariableManager get_vars() 10215 1727204034.63255: Calling all_inventory to load vars for managed-node3 10215 1727204034.63259: Calling groups_inventory to load vars for managed-node3 10215 1727204034.63263: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.63276: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.63281: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.63285: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.63414: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000007 10215 1727204034.63418: WORKER PROCESS EXITING 10215 1727204034.63653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.63938: done with get_vars() 10215 1727204034.63950: done getting variables 10215 1727204034.64027: in VariableManager get_vars() 10215 1727204034.64037: Calling all_inventory to load vars for managed-node3 10215 1727204034.64045: Calling groups_inventory to load vars for managed-node3 10215 1727204034.64049: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.64054: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.64058: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.64062: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.64491: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.64761: done with get_vars() 10215 1727204034.64776: done queuing things up, now waiting for results queue to drain 10215 1727204034.64778: results queue empty 10215 1727204034.64779: checking for any_errors_fatal 10215 1727204034.64782: done checking for any_errors_fatal 10215 1727204034.64783: checking for max_fail_percentage 10215 1727204034.64784: done checking for max_fail_percentage 10215 1727204034.64785: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.64786: done checking to see if all hosts have failed 10215 1727204034.64787: getting the remaining hosts for this loop 10215 1727204034.64788: done getting the remaining hosts for this loop 10215 1727204034.64793: getting the next task for host managed-node3 10215 1727204034.64798: done getting next task for host managed-node3 10215 1727204034.64800: ^ task is: TASK: meta (flush_handlers) 10215 1727204034.64801: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.64815: getting variables 10215 1727204034.64817: in VariableManager get_vars() 10215 1727204034.64827: Calling all_inventory to load vars for managed-node3 10215 1727204034.64829: Calling groups_inventory to load vars for managed-node3 10215 1727204034.64832: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.64838: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.64841: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.64845: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.65047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.65322: done with get_vars() 10215 1727204034.65331: done getting variables 10215 1727204034.65393: in VariableManager get_vars() 10215 1727204034.65402: Calling all_inventory to load vars for managed-node3 10215 1727204034.65405: Calling groups_inventory to load vars for managed-node3 10215 1727204034.65408: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.65413: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.65417: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.65420: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.65640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.65918: done with get_vars() 10215 1727204034.65932: done queuing things up, now waiting for results queue to drain 10215 1727204034.65934: results queue empty 10215 1727204034.65935: checking for any_errors_fatal 10215 1727204034.65936: done checking for any_errors_fatal 10215 1727204034.65937: checking for max_fail_percentage 10215 1727204034.65938: done checking for max_fail_percentage 10215 1727204034.65940: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.65941: done checking to see if all hosts have failed 10215 1727204034.65941: getting the remaining hosts for this loop 10215 1727204034.65942: done getting the remaining hosts for this loop 10215 1727204034.65945: getting the next task for host managed-node3 10215 1727204034.65948: done getting next task for host managed-node3 10215 1727204034.65949: ^ task is: None 10215 1727204034.65951: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.65952: done queuing things up, now waiting for results queue to drain 10215 1727204034.65953: results queue empty 10215 1727204034.65954: checking for any_errors_fatal 10215 1727204034.65955: done checking for any_errors_fatal 10215 1727204034.65956: checking for max_fail_percentage 10215 1727204034.65957: done checking for max_fail_percentage 10215 1727204034.65958: checking to see if all hosts have failed and the running result is not ok 10215 1727204034.65959: done checking to see if all hosts have failed 10215 1727204034.65960: getting the next task for host managed-node3 10215 1727204034.65963: done getting next task for host managed-node3 10215 1727204034.65964: ^ task is: None 10215 1727204034.65966: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204034.66015: in VariableManager get_vars() 10215 1727204034.66042: done with get_vars() 10215 1727204034.66049: in VariableManager get_vars() 10215 1727204034.66066: done with get_vars() 10215 1727204034.66071: variable 'omit' from source: magic vars 10215 1727204034.66109: in VariableManager get_vars() 10215 1727204034.66134: done with get_vars() 10215 1727204034.66161: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 10215 1727204034.67142: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 10215 1727204034.67167: getting the remaining hosts for this loop 10215 1727204034.67169: done getting the remaining hosts for this loop 10215 1727204034.67172: getting the next task for host managed-node3 10215 1727204034.67176: done getting next task for host managed-node3 10215 1727204034.67178: ^ task is: TASK: Gathering Facts 10215 1727204034.67180: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204034.67182: getting variables 10215 1727204034.67183: in VariableManager get_vars() 10215 1727204034.67201: Calling all_inventory to load vars for managed-node3 10215 1727204034.67203: Calling groups_inventory to load vars for managed-node3 10215 1727204034.67206: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204034.67217: Calling all_plugins_play to load vars for managed-node3 10215 1727204034.67234: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204034.67238: Calling groups_plugins_play to load vars for managed-node3 10215 1727204034.67441: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204034.67722: done with get_vars() 10215 1727204034.67731: done getting variables 10215 1727204034.67784: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Tuesday 24 September 2024 14:53:54 -0400 (0:00:00.068) 0:00:03.245 ***** 10215 1727204034.67814: entering _queue_task() for managed-node3/gather_facts 10215 1727204034.68080: worker is 1 (out of 1 available) 10215 1727204034.68203: exiting _queue_task() for managed-node3/gather_facts 10215 1727204034.68215: done queuing things up, now waiting for results queue to drain 10215 1727204034.68217: waiting for pending results... 
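The 'Gathering Facts' task queued above is the fact-gathering step for the play announced just before it ('Play for testing bond connection', tests_bond.yml:3). A sketch of the corresponding play header; the hosts pattern is an assumption, as the log only shows the play running against managed-node3:

    - name: Play for testing bond connection
      hosts: managed-node3   # assumption; the real pattern in tests_bond.yml is not shown here
      gather_facts: true     # matches the Gathering Facts task that runs first for this play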
10215 1727204034.68377: running TaskExecutor() for managed-node3/TASK: Gathering Facts 10215 1727204034.68527: in run() - task 12b410aa-8751-3c74-8f8e-000000000128 10215 1727204034.68531: variable 'ansible_search_path' from source: unknown 10215 1727204034.68568: calling self._execute() 10215 1727204034.68694: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.68697: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.68700: variable 'omit' from source: magic vars 10215 1727204034.69148: variable 'ansible_distribution_major_version' from source: facts 10215 1727204034.69167: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204034.69184: variable 'omit' from source: magic vars 10215 1727204034.69293: variable 'omit' from source: magic vars 10215 1727204034.69298: variable 'omit' from source: magic vars 10215 1727204034.69327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204034.69375: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204034.69411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204034.69440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.69459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204034.69506: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204034.69519: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.69617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.69671: Set connection var ansible_connection to ssh 10215 1727204034.69686: Set connection var ansible_pipelining to False 10215 1727204034.69702: Set connection var ansible_shell_type to sh 10215 1727204034.69716: Set connection var ansible_timeout to 10 10215 1727204034.69740: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204034.69755: Set connection var ansible_shell_executable to /bin/sh 10215 1727204034.69784: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.69796: variable 'ansible_connection' from source: unknown 10215 1727204034.69805: variable 'ansible_module_compression' from source: unknown 10215 1727204034.69814: variable 'ansible_shell_type' from source: unknown 10215 1727204034.69835: variable 'ansible_shell_executable' from source: unknown 10215 1727204034.69939: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204034.69945: variable 'ansible_pipelining' from source: unknown 10215 1727204034.69947: variable 'ansible_timeout' from source: unknown 10215 1727204034.69950: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204034.70092: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204034.70114: variable 'omit' from source: magic vars 10215 1727204034.70125: starting attempt loop 10215 1727204034.70133: running the 
handler 10215 1727204034.70155: variable 'ansible_facts' from source: unknown 10215 1727204034.70285: _low_level_execute_command(): starting 10215 1727204034.70291: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204034.71011: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.71068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.71085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.71115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.71186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.72962: stdout chunk (state=3): >>>/root <<< 10215 1727204034.73174: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.73178: stdout chunk (state=3): >>><<< 10215 1727204034.73181: stderr chunk (state=3): >>><<< 10215 1727204034.73207: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204034.73327: _low_level_execute_command(): starting 10215 1727204034.73332: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330 `" && echo ansible-tmp-1727204034.7321844-10592-142399051243330="` echo /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330 `" ) && sleep 0' 10215 
1727204034.73936: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204034.73952: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204034.73969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.74013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.74126: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.74155: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.74232: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.76204: stdout chunk (state=3): >>>ansible-tmp-1727204034.7321844-10592-142399051243330=/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330 <<< 10215 1727204034.76405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.76412: stdout chunk (state=3): >>><<< 10215 1727204034.76415: stderr chunk (state=3): >>><<< 10215 1727204034.76435: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204034.7321844-10592-142399051243330=/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204034.76594: variable 'ansible_module_compression' from source: unknown 10215 1727204034.76598: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 10215 1727204034.76601: variable 'ansible_facts' from source: unknown 10215 1727204034.76815: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py 10215 1727204034.77134: Sending initial data 10215 1727204034.77164: Sent initial data (154 bytes) 10215 1727204034.77771: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204034.77826: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204034.77906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.77948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.77964: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.77987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.78060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.79726: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204034.79746: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204034.79799: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpmwuc0s0i /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py <<< 10215 1727204034.79803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py" <<< 10215 1727204034.79906: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpmwuc0s0i" to remote "/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py" <<< 10215 1727204034.82303: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.82344: stderr chunk (state=3): >>><<< 10215 1727204034.82355: stdout chunk (state=3): >>><<< 10215 1727204034.82394: done transferring module to remote 10215 1727204034.82413: _low_level_execute_command(): starting 10215 1727204034.82438: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/ /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py && sleep 0' 10215 1727204034.83197: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204034.83219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.83239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.83312: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204034.85213: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204034.85229: stdout chunk (state=3): >>><<< 10215 1727204034.85241: stderr chunk (state=3): >>><<< 10215 1727204034.85349: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204034.85357: _low_level_execute_command(): starting 10215 1727204034.85360: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/AnsiballZ_setup.py && sleep 0' 10215 1727204034.85950: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204034.86007: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204034.86077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204034.86098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204034.86134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204034.86209: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204035.54599: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", 
"GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2882, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 835, "free": 2882}, "nocache": {"free": 3488, "used": 229}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 539, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251165011968, "block_size": 4096, "block_total": 64479564, "block_available": 61319583, "block_used": 3159981, "inode_total": 16384000, "inode_available": 16302335, "inode_used": 81665, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "55", "epoch": "1727204035", "epoch_int": "1727204035", "date": "2024-09-24", "time": "14:53:55", "iso8601_micro": "2024-09-24T18:53:55.511390Z", "iso8601": "2024-09-24T18:53:55Z", "iso8601_basic": "20240924T145355511390", "iso8601_basic_short": "20240924T145355", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4677734375, "5m": 
0.390625, "15m": 0.224609375}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 10215 1727204035.56518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204035.56733: stderr chunk (state=3): >>><<< 10215 1727204035.56737: stdout chunk (state=3): >>><<< 10215 1727204035.56776: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAI5YZQ7OH6eqgmanrwxkUl16pMvE2q26X32NofYRKBzF04m84VIsiCBP80rN+sGEKnRhTwlxJwcSfAyscmxkynk8ozeR0SaMEECkbOjee1DqGR1yz8VSKEIk2gZ+ImYscF6c32jGvz1w/gz9baswEs+v92Ljqv3+V3s8foVkwWM1AAAAFQDApo03iAyJzp9y7AillVl9LpN8rwAAAIBNHNvfLLH/rvWMdavYWGiljarx5Z8cDKFv4QiliuY2AenrQ5mjBN3ZJZuDpmwC9vuoPM+TWxp9pbrnVJy4VM6iS8c/Lr9I982fUD4neMvJEywdnYtsRhezGMCk57/Npw91h6EKhcAYiaFF53jl540WIjTvu2bEA8Hgb11YGH+isAAAAIAkremps+61DEFeDWQjRHbf8fZzhmpUYduU+sHRW5usa/1cOOeeN/8XBHfMST6TPedAY/6t7Oxda9D2mq6mo2Rl9arSQWcBypqwvzRiz0LGnRnElGtXKJALy6vYKG7xi+29ZmqlBvD14cB7/wSZqZP9MkRj3+QzQJLvNnuGRyLguA==", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDBj8PEtqglWtlJ3r3hgP2TELjSd8JOOpjIitLlWjKdUao5ePB6PWTf9MZV0rLZr0re7hAS1EWeexARYQakyETmyOoPmRCaD5vvrfN3AJJ6I+O2EhApLpYrEORJbTfrme6AoCGmxQG8tR7j3YpVOvePZ65ka7FDUWnNLI0DWpyDURAKmvOxtiOcYazpmB7GJ/5ycpEAV7KGp7tEQ9MNIAbSaYTBXVBNa5V2HyEmcabs+/Qy/jp8OWy+Tl3uCUV0SmFplVCKib9Kp3eEMZd5udXsYnmUYtLNMJQkQOzTdol5AozustkdBnasVn/RSnQpWQMBrrUQMxchNOb8FDAuH6AONEVJl9mHY6mk3zfkkyPZE6sIrMIj0B48xTWzMIjC+N9SN7DRRUWzjYIqkL5fsYu0fkkGuZeNvyJRlv8h7oFWA7YtvNHdNYf41mkXryERg8V3zI0aZcmQul6XTOxywwd4b5sudMIng09hfyPOKtnYi6DIN2h5FxOWlvBEbLlcd2U=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPUvqdp1GSRMDwSqfOZO1hLGpDfzy51B9cIhTK2AWy7qlUXPaSlJ0jc31uj+CW3SnUW36VSKRHdj9R9hJev9Zic=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIFL7RdA+aCgUcBhcJBLwti3mnwduhYXxSw8RlI3Cvebm", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node3", "ansible_hostname": "managed-node3", "ansible_nodename": "managed-node3", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec272ed147e29e35f2e68cd6465c5ec1", "ansible_lsb": {}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": 
"/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50414 10.31.10.90 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "6", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50414 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2882, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 835, "free": 2882}, "nocache": {"free": 3488, "used": 229}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_uuid": "ec272ed1-47e2-9e35-f2e6-8cd6465c5ec1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 539, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": 
"ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251165011968, "block_size": 4096, "block_total": 64479564, "block_available": 61319583, "block_used": 3159981, "inode_total": 16384000, "inode_available": 16302335, "inode_used": 81665, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "55", "epoch": "1727204035", "epoch_int": "1727204035", "date": "2024-09-24", "time": "14:53:55", "iso8601_micro": "2024-09-24T18:53:55.511390Z", "iso8601": "2024-09-24T18:53:55Z", "iso8601_basic": "20240924T145355511390", "iso8601_basic_short": "20240924T145355", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_loadavg": {"1m": 0.4677734375, "5m": 0.390625, "15m": 0.224609375}, "ansible_pkg_mgr": "dnf", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::37d3:4e93:30d:de94", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.10.90", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:5e:c8:16:36:1d", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.10.90"], "ansible_all_ipv6_addresses": ["fe80::37d3:4e93:30d:de94"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.10.90", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::37d3:4e93:30d:de94"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204035.57695: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204035.57699: _low_level_execute_command(): starting 10215 1727204035.57702: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204034.7321844-10592-142399051243330/ > /dev/null 2>&1 && sleep 0' 10215 1727204035.59322: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204035.59542: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204035.59567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204035.59638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204035.61652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204035.61662: stdout chunk (state=3): >>><<< 10215 1727204035.61673: stderr chunk 
(state=3): >>><<< 10215 1727204035.61694: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204035.61710: handler run complete 10215 1727204035.62024: variable 'ansible_facts' from source: unknown 10215 1727204035.62513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.63865: variable 'ansible_facts' from source: unknown 10215 1727204035.64105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.64531: attempt loop complete, returning result 10215 1727204035.64542: _execute() done 10215 1727204035.64550: dumping result to json 10215 1727204035.64586: done dumping result, returning 10215 1727204035.64628: done running TaskExecutor() for managed-node3/TASK: Gathering Facts [12b410aa-8751-3c74-8f8e-000000000128] 10215 1727204035.64724: sending task result for task 12b410aa-8751-3c74-8f8e-000000000128 10215 1727204035.65806: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000128 10215 1727204035.65813: WORKER PROCESS EXITING ok: [managed-node3] 10215 1727204035.66469: no more pending results, returning what we have 10215 1727204035.66473: results queue empty 10215 1727204035.66474: checking for any_errors_fatal 10215 1727204035.66476: done checking for any_errors_fatal 10215 1727204035.66477: checking for max_fail_percentage 10215 1727204035.66478: done checking for max_fail_percentage 10215 1727204035.66479: checking to see if all hosts have failed and the running result is not ok 10215 1727204035.66481: done checking to see if all hosts have failed 10215 1727204035.66482: getting the remaining hosts for this loop 10215 1727204035.66483: done getting the remaining hosts for this loop 10215 1727204035.66488: getting the next task for host managed-node3 10215 1727204035.66698: done getting next task for host managed-node3 10215 1727204035.66701: ^ task is: TASK: meta (flush_handlers) 10215 1727204035.66703: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204035.66710: getting variables 10215 1727204035.66712: in VariableManager get_vars() 10215 1727204035.66749: Calling all_inventory to load vars for managed-node3 10215 1727204035.66753: Calling groups_inventory to load vars for managed-node3 10215 1727204035.66756: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204035.66769: Calling all_plugins_play to load vars for managed-node3 10215 1727204035.66773: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204035.66777: Calling groups_plugins_play to load vars for managed-node3 10215 1727204035.67204: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.68439: done with get_vars() 10215 1727204035.68452: done getting variables 10215 1727204035.68539: in VariableManager get_vars() 10215 1727204035.68557: Calling all_inventory to load vars for managed-node3 10215 1727204035.68560: Calling groups_inventory to load vars for managed-node3 10215 1727204035.68563: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204035.68569: Calling all_plugins_play to load vars for managed-node3 10215 1727204035.68572: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204035.68576: Calling groups_plugins_play to load vars for managed-node3 10215 1727204035.69169: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.70105: done with get_vars() 10215 1727204035.70126: done queuing things up, now waiting for results queue to drain 10215 1727204035.70128: results queue empty 10215 1727204035.70129: checking for any_errors_fatal 10215 1727204035.70135: done checking for any_errors_fatal 10215 1727204035.70136: checking for max_fail_percentage 10215 1727204035.70137: done checking for max_fail_percentage 10215 1727204035.70138: checking to see if all hosts have failed and the running result is not ok 10215 1727204035.70139: done checking to see if all hosts have failed 10215 1727204035.70140: getting the remaining hosts for this loop 10215 1727204035.70147: done getting the remaining hosts for this loop 10215 1727204035.70150: getting the next task for host managed-node3 10215 1727204035.70155: done getting next task for host managed-node3 10215 1727204035.70158: ^ task is: TASK: INIT Prepare setup 10215 1727204035.70160: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204035.70163: getting variables 10215 1727204035.70164: in VariableManager get_vars() 10215 1727204035.70182: Calling all_inventory to load vars for managed-node3 10215 1727204035.70184: Calling groups_inventory to load vars for managed-node3 10215 1727204035.70187: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204035.70593: Calling all_plugins_play to load vars for managed-node3 10215 1727204035.70598: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204035.70603: Calling groups_plugins_play to load vars for managed-node3 10215 1727204035.70901: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.71527: done with get_vars() 10215 1727204035.71540: done getting variables 10215 1727204035.71638: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Tuesday 24 September 2024 14:53:55 -0400 (0:00:01.038) 0:00:04.283 ***** 10215 1727204035.71670: entering _queue_task() for managed-node3/debug 10215 1727204035.71673: Creating lock for debug 10215 1727204035.72626: worker is 1 (out of 1 available) 10215 1727204035.72641: exiting _queue_task() for managed-node3/debug 10215 1727204035.72894: done queuing things up, now waiting for results queue to drain 10215 1727204035.72900: waiting for pending results... 
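
The gather-facts exchange above shows the full module lifecycle on managed-node3: a private remote tmpdir is created, AnsiballZ_setup.py is transferred over SFTP, marked executable with chmod u+x, run with /usr/bin/python3.12, and the tmpdir is then removed with rm -f -r. If you want to inspect the transferred AnsiballZ wrapper instead of having it cleaned up, the same run can be repeated with ANSIBLE_KEEP_REMOTE_FILES=1. A minimal sketch, assuming the playbook is tests_bond.yml (named in the task path above) and an inventory file called inventory.yml (the path here is an assumption, not the one used in this run):

    import os
    import subprocess

    # Re-run the playbook but keep the remote ~/.ansible/tmp/ansible-tmp-* directories
    # (and the AnsiballZ_*.py payloads inside them) for inspection instead of deleting them.
    env = dict(os.environ, ANSIBLE_KEEP_REMOTE_FILES="1")
    subprocess.run(
        [
            "ansible-playbook", "-vvvv",
            "-i", "inventory.yml",  # assumption: inventory path for illustration only
            "tests_bond.yml",       # playbook named in the task path above
        ],
        env=env,
        check=False,  # test playbooks may intentionally fail hosts; don't raise here
    )

With that variable set, the cleanup step seen above ("rm -f -r .../ansible-tmp-.../") is skipped and the payload stays under /root/.ansible/tmp on the managed node.
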
10215 1727204035.73229: running TaskExecutor() for managed-node3/TASK: INIT Prepare setup 10215 1727204035.73450: in run() - task 12b410aa-8751-3c74-8f8e-00000000000b 10215 1727204035.73596: variable 'ansible_search_path' from source: unknown 10215 1727204035.73606: calling self._execute() 10215 1727204035.73786: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.73895: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.73901: variable 'omit' from source: magic vars 10215 1727204035.75100: variable 'ansible_distribution_major_version' from source: facts 10215 1727204035.75124: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204035.75400: variable 'omit' from source: magic vars 10215 1727204035.75403: variable 'omit' from source: magic vars 10215 1727204035.75406: variable 'omit' from source: magic vars 10215 1727204035.75436: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204035.75482: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204035.75553: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204035.75657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204035.75676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204035.75723: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204035.75851: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.75854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.76055: Set connection var ansible_connection to ssh 10215 1727204035.76191: Set connection var ansible_pipelining to False 10215 1727204035.76394: Set connection var ansible_shell_type to sh 10215 1727204035.76397: Set connection var ansible_timeout to 10 10215 1727204035.76400: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204035.76403: Set connection var ansible_shell_executable to /bin/sh 10215 1727204035.76406: variable 'ansible_shell_executable' from source: unknown 10215 1727204035.76411: variable 'ansible_connection' from source: unknown 10215 1727204035.76413: variable 'ansible_module_compression' from source: unknown 10215 1727204035.76415: variable 'ansible_shell_type' from source: unknown 10215 1727204035.76417: variable 'ansible_shell_executable' from source: unknown 10215 1727204035.76419: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.76421: variable 'ansible_pipelining' from source: unknown 10215 1727204035.76424: variable 'ansible_timeout' from source: unknown 10215 1727204035.76426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.76868: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204035.76872: variable 'omit' from source: magic vars 10215 1727204035.76875: starting attempt loop 10215 1727204035.76877: running the handler 10215 
1727204035.76879: handler run complete 10215 1727204035.77001: attempt loop complete, returning result 10215 1727204035.77012: _execute() done 10215 1727204035.77021: dumping result to json 10215 1727204035.77028: done dumping result, returning 10215 1727204035.77041: done running TaskExecutor() for managed-node3/TASK: INIT Prepare setup [12b410aa-8751-3c74-8f8e-00000000000b] 10215 1727204035.77053: sending task result for task 12b410aa-8751-3c74-8f8e-00000000000b ok: [managed-node3] => {} MSG: ################################################## 10215 1727204035.77350: no more pending results, returning what we have 10215 1727204035.77354: results queue empty 10215 1727204035.77355: checking for any_errors_fatal 10215 1727204035.77358: done checking for any_errors_fatal 10215 1727204035.77359: checking for max_fail_percentage 10215 1727204035.77360: done checking for max_fail_percentage 10215 1727204035.77362: checking to see if all hosts have failed and the running result is not ok 10215 1727204035.77363: done checking to see if all hosts have failed 10215 1727204035.77364: getting the remaining hosts for this loop 10215 1727204035.77366: done getting the remaining hosts for this loop 10215 1727204035.77371: getting the next task for host managed-node3 10215 1727204035.77379: done getting next task for host managed-node3 10215 1727204035.77382: ^ task is: TASK: Install dnsmasq 10215 1727204035.77386: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204035.77392: getting variables 10215 1727204035.77394: in VariableManager get_vars() 10215 1727204035.77916: Calling all_inventory to load vars for managed-node3 10215 1727204035.77919: Calling groups_inventory to load vars for managed-node3 10215 1727204035.77922: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204035.77934: Calling all_plugins_play to load vars for managed-node3 10215 1727204035.77938: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204035.77943: Calling groups_plugins_play to load vars for managed-node3 10215 1727204035.78369: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000000b 10215 1727204035.78374: WORKER PROCESS EXITING 10215 1727204035.78393: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204035.78881: done with get_vars() 10215 1727204035.78895: done getting variables 10215 1727204035.78963: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:53:55 -0400 (0:00:00.075) 0:00:04.359 ***** 10215 1727204035.79204: entering _queue_task() for managed-node3/package 10215 1727204035.79696: worker is 1 (out of 1 available) 10215 1727204035.79710: exiting _queue_task() for managed-node3/package 10215 1727204035.79723: done queuing things up, now waiting for results queue to drain 10215 1727204035.79725: waiting for pending results... 
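
The facts gathered earlier in this run report "ansible_pkg_mgr": "dnf" on this Fedora 39 node, so the generic 'package' action loaded for the "Install dnsmasq" task will be carried out by the dnf module. As a rough, hedged sketch of what that task amounts to when driven ad hoc (the inventory path is again an assumption; managed-node3 is the host targeted above):

    import subprocess

    # Illustrative only: approximately what the "Install dnsmasq" task resolves to
    # once the generic `package` action picks dnf based on the gathered facts.
    subprocess.run(
        [
            "ansible", "managed-node3",
            "-i", "inventory.yml",            # assumption: inventory path for illustration
            "-m", "ansible.builtin.dnf",
            "-a", "name=dnsmasq state=present",
        ],
        check=True,
    )
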
10215 1727204035.80207: running TaskExecutor() for managed-node3/TASK: Install dnsmasq 10215 1727204035.80514: in run() - task 12b410aa-8751-3c74-8f8e-00000000000f 10215 1727204035.80537: variable 'ansible_search_path' from source: unknown 10215 1727204035.80548: variable 'ansible_search_path' from source: unknown 10215 1727204035.80598: calling self._execute() 10215 1727204035.80822: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.80899: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.80916: variable 'omit' from source: magic vars 10215 1727204035.81718: variable 'ansible_distribution_major_version' from source: facts 10215 1727204035.81994: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204035.81999: variable 'omit' from source: magic vars 10215 1727204035.82002: variable 'omit' from source: magic vars 10215 1727204035.82424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204035.85780: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204035.85871: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204035.85929: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204035.85981: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204035.86019: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204035.86150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204035.86183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204035.86220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204035.86595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204035.86599: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204035.86602: variable '__network_is_ostree' from source: set_fact 10215 1727204035.86605: variable 'omit' from source: magic vars 10215 1727204035.86607: variable 'omit' from source: magic vars 10215 1727204035.86610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204035.86615: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204035.86636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204035.86708: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10215 1727204035.86724: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204035.86812: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204035.86816: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.86822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.87006: Set connection var ansible_connection to ssh 10215 1727204035.87024: Set connection var ansible_pipelining to False 10215 1727204035.87037: Set connection var ansible_shell_type to sh 10215 1727204035.87046: Set connection var ansible_timeout to 10 10215 1727204035.87053: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204035.87064: Set connection var ansible_shell_executable to /bin/sh 10215 1727204035.87093: variable 'ansible_shell_executable' from source: unknown 10215 1727204035.87097: variable 'ansible_connection' from source: unknown 10215 1727204035.87102: variable 'ansible_module_compression' from source: unknown 10215 1727204035.87105: variable 'ansible_shell_type' from source: unknown 10215 1727204035.87113: variable 'ansible_shell_executable' from source: unknown 10215 1727204035.87116: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204035.87130: variable 'ansible_pipelining' from source: unknown 10215 1727204035.87140: variable 'ansible_timeout' from source: unknown 10215 1727204035.87146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204035.87282: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204035.87294: variable 'omit' from source: magic vars 10215 1727204035.87301: starting attempt loop 10215 1727204035.87304: running the handler 10215 1727204035.87316: variable 'ansible_facts' from source: unknown 10215 1727204035.87319: variable 'ansible_facts' from source: unknown 10215 1727204035.87372: _low_level_execute_command(): starting 10215 1727204035.87381: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204035.88208: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204035.88269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204035.88292: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204035.88341: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204035.88392: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204035.90169: stdout chunk (state=3): >>>/root <<< 10215 1727204035.90320: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204035.90387: stderr chunk (state=3): >>><<< 10215 1727204035.90424: stdout chunk (state=3): >>><<< 10215 1727204035.90563: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204035.90574: _low_level_execute_command(): starting 10215 1727204035.90577: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429 `" && echo ansible-tmp-1727204035.9045124-10703-268635129490429="` echo /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429 `" ) && sleep 0' 10215 1727204035.91358: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204035.91433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204035.91468: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204035.91485: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204035.91672: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 
10215 1727204035.93612: stdout chunk (state=3): >>>ansible-tmp-1727204035.9045124-10703-268635129490429=/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429 <<< 10215 1727204035.93809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204035.93814: stdout chunk (state=3): >>><<< 10215 1727204035.93820: stderr chunk (state=3): >>><<< 10215 1727204035.93849: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204035.9045124-10703-268635129490429=/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204035.93890: variable 'ansible_module_compression' from source: unknown 10215 1727204035.93968: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 10215 1727204035.93972: ANSIBALLZ: Acquiring lock 10215 1727204035.93974: ANSIBALLZ: Lock acquired: 139878728192448 10215 1727204035.93977: ANSIBALLZ: Creating module 10215 1727204036.32681: ANSIBALLZ: Writing module into payload 10215 1727204036.33213: ANSIBALLZ: Writing module 10215 1727204036.33217: ANSIBALLZ: Renaming module 10215 1727204036.33220: ANSIBALLZ: Done creating module 10215 1727204036.33238: variable 'ansible_facts' from source: unknown 10215 1727204036.33352: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py 10215 1727204036.33721: Sending initial data 10215 1727204036.33724: Sent initial data (152 bytes) 10215 1727204036.34650: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204036.34698: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204036.34777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204036.36498: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204036.36561: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204036.36624: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5ripzfwe /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py <<< 10215 1727204036.36669: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py" <<< 10215 1727204036.36673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5ripzfwe" to remote "/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py" <<< 10215 1727204036.38398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204036.38441: stderr chunk (state=3): >>><<< 10215 1727204036.38444: stdout chunk (state=3): >>><<< 10215 1727204036.38807: done transferring module to remote 10215 1727204036.38815: _low_level_execute_command(): starting 10215 1727204036.38819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/ /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py && sleep 0' 10215 1727204036.39599: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204036.39603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204036.39704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204036.39746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204036.39769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204036.39785: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204036.39931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204036.41880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204036.41893: stdout chunk (state=3): >>><<< 10215 1727204036.42028: stderr chunk (state=3): >>><<< 10215 1727204036.42038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204036.42041: _low_level_execute_command(): starting 10215 1727204036.42044: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/AnsiballZ_dnf.py && sleep 0' 10215 1727204036.43111: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204036.43136: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204036.43152: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204036.43244: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204036.43412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 10215 1727204037.89658: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10215 1727204037.94564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204037.94577: stdout chunk (state=3): >>><<< 10215 1727204037.94596: stderr chunk (state=3): >>><<< 10215 1727204037.94625: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204037.94694: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204037.94719: _low_level_execute_command(): starting 10215 1727204037.94733: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204035.9045124-10703-268635129490429/ > /dev/null 2>&1 && sleep 0' 10215 1727204037.95411: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204037.95429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204037.95445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204037.95498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204037.95574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204037.95610: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204037.95710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204037.97555: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204037.97610: stderr chunk (state=3): >>><<< 10215 1727204037.97613: stdout chunk (state=3): >>><<< 10215 1727204037.97696: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204037.97700: handler run complete 10215 1727204037.97779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204037.97935: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204037.97971: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204037.98000: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204037.98030: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204037.98091: variable '__install_status' from source: unknown 10215 1727204037.98109: Evaluated conditional (__install_status is success): True 10215 1727204037.98126: attempt loop complete, returning result 10215 1727204037.98129: _execute() done 10215 1727204037.98133: dumping result to json 10215 1727204037.98141: done dumping result, returning 10215 1727204037.98152: done running TaskExecutor() for managed-node3/TASK: Install dnsmasq [12b410aa-8751-3c74-8f8e-00000000000f] 10215 1727204037.98160: sending task result for task 12b410aa-8751-3c74-8f8e-00000000000f 10215 1727204037.98262: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000000f 10215 1727204037.98265: WORKER PROCESS EXITING ok: [managed-node3] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10215 1727204037.98377: no more pending results, returning what we have 10215 1727204037.98381: results queue empty 10215 1727204037.98382: checking for any_errors_fatal 10215 1727204037.98398: done checking for any_errors_fatal 10215 1727204037.98400: checking for max_fail_percentage 10215 1727204037.98402: done checking for max_fail_percentage 10215 1727204037.98403: checking to see if all hosts have failed and the running result is not ok 10215 1727204037.98404: done checking to see if all hosts have failed 10215 1727204037.98405: getting the remaining hosts for this loop 10215 1727204037.98407: done getting the remaining hosts for this loop 10215 1727204037.98411: getting the next task for host managed-node3 10215 1727204037.98418: done getting next task for host managed-node3 10215 1727204037.98420: ^ task is: TASK: Install pgrep, sysctl 10215 1727204037.98423: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204037.98427: getting variables 10215 1727204037.98428: in VariableManager get_vars() 10215 1727204037.98468: Calling all_inventory to load vars for managed-node3 10215 1727204037.98471: Calling groups_inventory to load vars for managed-node3 10215 1727204037.98473: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204037.98484: Calling all_plugins_play to load vars for managed-node3 10215 1727204037.98487: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204037.98493: Calling groups_plugins_play to load vars for managed-node3 10215 1727204037.98692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204037.98875: done with get_vars() 10215 1727204037.98886: done getting variables 10215 1727204037.98971: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:53:57 -0400 (0:00:02.198) 0:00:06.557 ***** 10215 1727204037.99012: entering _queue_task() for managed-node3/package 10215 1727204037.99342: worker is 1 (out of 1 available) 10215 1727204037.99354: exiting _queue_task() for managed-node3/package 10215 1727204037.99366: done queuing things up, now waiting for results queue to drain 10215 1727204037.99368: waiting for pending results... 
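For readers reconstructing the test playbook from this trace: the "Install dnsmasq" task that just completed used the 'package' action plugin (resolved to ansible.legacy.dnf), passed only name=dnsmasq and state=present, and evaluated '__install_status is success' with "attempts": 1 in its result, which implies a register/until retry loop. A minimal sketch of what that task presumably looks like; the file it lives in and the retries/delay values are assumptions, not visible in this part of the log:

- name: Install dnsmasq
  ansible.builtin.package:
    name: dnsmasq
    state: present
  register: __install_status            # the trace evaluates '__install_status is success'
  until: __install_status is success
  retries: 3                            # assumed; only "attempts": 1 appears in the result
  delay: 5                              # assumed

The "Nothing to do" / changed: false result indicates dnsmasq was already installed on managed-node3, so the task succeeded on the first attempt.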
10215 1727204037.99769: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 10215 1727204037.99828: in run() - task 12b410aa-8751-3c74-8f8e-000000000010 10215 1727204037.99855: variable 'ansible_search_path' from source: unknown 10215 1727204037.99866: variable 'ansible_search_path' from source: unknown 10215 1727204037.99925: calling self._execute() 10215 1727204037.99995: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204038.00005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204038.00016: variable 'omit' from source: magic vars 10215 1727204038.00338: variable 'ansible_distribution_major_version' from source: facts 10215 1727204038.00350: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204038.00452: variable 'ansible_os_family' from source: facts 10215 1727204038.00459: Evaluated conditional (ansible_os_family == 'RedHat'): True 10215 1727204038.00605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204038.00831: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204038.00873: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204038.00906: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204038.00938: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204038.01005: variable 'ansible_distribution_major_version' from source: facts 10215 1727204038.01019: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 10215 1727204038.01022: when evaluation is False, skipping this task 10215 1727204038.01025: _execute() done 10215 1727204038.01028: dumping result to json 10215 1727204038.01032: done dumping result, returning 10215 1727204038.01039: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [12b410aa-8751-3c74-8f8e-000000000010] 10215 1727204038.01045: sending task result for task 12b410aa-8751-3c74-8f8e-000000000010 10215 1727204038.01141: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000010 10215 1727204038.01144: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 10215 1727204038.01201: no more pending results, returning what we have 10215 1727204038.01205: results queue empty 10215 1727204038.01207: checking for any_errors_fatal 10215 1727204038.01213: done checking for any_errors_fatal 10215 1727204038.01214: checking for max_fail_percentage 10215 1727204038.01216: done checking for max_fail_percentage 10215 1727204038.01217: checking to see if all hosts have failed and the running result is not ok 10215 1727204038.01218: done checking to see if all hosts have failed 10215 1727204038.01218: getting the remaining hosts for this loop 10215 1727204038.01220: done getting the remaining hosts for this loop 10215 1727204038.01224: getting the next task for host managed-node3 10215 1727204038.01229: done getting next task for host managed-node3 10215 1727204038.01231: ^ task is: TASK: Install pgrep, sysctl 10215 1727204038.01234: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204038.01238: getting variables 10215 1727204038.01239: in VariableManager get_vars() 10215 1727204038.01276: Calling all_inventory to load vars for managed-node3 10215 1727204038.01279: Calling groups_inventory to load vars for managed-node3 10215 1727204038.01282: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204038.01294: Calling all_plugins_play to load vars for managed-node3 10215 1727204038.01297: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204038.01301: Calling groups_plugins_play to load vars for managed-node3 10215 1727204038.01444: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204038.01607: done with get_vars() 10215 1727204038.01617: done getting variables 10215 1727204038.01664: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:53:58 -0400 (0:00:00.026) 0:00:06.584 ***** 10215 1727204038.01687: entering _queue_task() for managed-node3/package 10215 1727204038.01880: worker is 1 (out of 1 available) 10215 1727204038.01895: exiting _queue_task() for managed-node3/package 10215 1727204038.01906: done queuing things up, now waiting for results queue to drain 10215 1727204038.01908: waiting for pending results... 
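The skip recorded above follows the common pattern of two package tasks with the same name gated by mutually exclusive 'when' conditions: the variant at line 17 of create_test_interfaces_with_dhcp.yml applies only when ansible_distribution_major_version is version('6', '<='), which evaluated False on this host, so it was skipped before any module ran. A rough sketch of that branch; the package name is a placeholder, since a skipped task never exposes its module args in the trace:

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps                        # hypothetical; the actual EL6 package list is not shown in this log
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('6', '<=')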
10215 1727204038.02062: running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl 10215 1727204038.02150: in run() - task 12b410aa-8751-3c74-8f8e-000000000011 10215 1727204038.02154: variable 'ansible_search_path' from source: unknown 10215 1727204038.02158: variable 'ansible_search_path' from source: unknown 10215 1727204038.02186: calling self._execute() 10215 1727204038.02253: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204038.02258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204038.02270: variable 'omit' from source: magic vars 10215 1727204038.02562: variable 'ansible_distribution_major_version' from source: facts 10215 1727204038.02575: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204038.02672: variable 'ansible_os_family' from source: facts 10215 1727204038.02678: Evaluated conditional (ansible_os_family == 'RedHat'): True 10215 1727204038.02832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204038.03152: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204038.03395: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204038.03399: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204038.03402: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204038.03404: variable 'ansible_distribution_major_version' from source: facts 10215 1727204038.03410: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 10215 1727204038.03416: variable 'omit' from source: magic vars 10215 1727204038.03465: variable 'omit' from source: magic vars 10215 1727204038.03665: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204038.05944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204038.06027: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204038.06073: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204038.06121: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204038.06156: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204038.06269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204038.06315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204038.06353: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204038.06415: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204038.06437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204038.06554: variable '__network_is_ostree' from source: set_fact 10215 1727204038.06565: variable 'omit' from source: magic vars 10215 1727204038.06603: variable 'omit' from source: magic vars 10215 1727204038.06638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204038.06675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204038.06701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204038.06731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204038.06748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204038.06785: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204038.06797: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204038.06806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204038.06935: Set connection var ansible_connection to ssh 10215 1727204038.06948: Set connection var ansible_pipelining to False 10215 1727204038.06960: Set connection var ansible_shell_type to sh 10215 1727204038.06971: Set connection var ansible_timeout to 10 10215 1727204038.06982: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204038.06999: Set connection var ansible_shell_executable to /bin/sh 10215 1727204038.07031: variable 'ansible_shell_executable' from source: unknown 10215 1727204038.07042: variable 'ansible_connection' from source: unknown 10215 1727204038.07050: variable 'ansible_module_compression' from source: unknown 10215 1727204038.07057: variable 'ansible_shell_type' from source: unknown 10215 1727204038.07194: variable 'ansible_shell_executable' from source: unknown 10215 1727204038.07197: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204038.07200: variable 'ansible_pipelining' from source: unknown 10215 1727204038.07202: variable 'ansible_timeout' from source: unknown 10215 1727204038.07204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204038.07224: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204038.07243: variable 'omit' from source: magic vars 10215 1727204038.07254: starting attempt loop 10215 1727204038.07263: running the handler 10215 1727204038.07276: variable 'ansible_facts' from source: unknown 10215 1727204038.07285: variable 'ansible_facts' from source: unknown 10215 1727204038.07334: _low_level_execute_command(): starting 10215 1727204038.07350: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204038.08059: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 <<< 10215 1727204038.08077: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204038.08095: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204038.08123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204038.08138: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204038.08146: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204038.08157: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204038.08212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204038.08269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204038.08288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204038.08319: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204038.08398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204038.10125: stdout chunk (state=3): >>>/root <<< 10215 1727204038.10314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204038.10346: stderr chunk (state=3): >>><<< 10215 1727204038.10361: stdout chunk (state=3): >>><<< 10215 1727204038.10397: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204038.10425: _low_level_execute_command(): starting 10215 1727204038.10437: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094 `" && echo ansible-tmp-1727204038.1040802-10788-252312788548094="` echo 
/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094 `" ) && sleep 0' 10215 1727204038.11172: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204038.11296: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204038.11327: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204038.11401: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204038.13375: stdout chunk (state=3): >>>ansible-tmp-1727204038.1040802-10788-252312788548094=/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094 <<< 10215 1727204038.13616: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204038.13710: stderr chunk (state=3): >>><<< 10215 1727204038.13721: stdout chunk (state=3): >>><<< 10215 1727204038.13747: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204038.1040802-10788-252312788548094=/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204038.13895: variable 'ansible_module_compression' from source: unknown 10215 1727204038.13898: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 10215 1727204038.13928: variable 'ansible_facts' from source: unknown 10215 1727204038.14052: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py 10215 1727204038.14236: Sending initial data 10215 1727204038.14338: Sent initial data (152 bytes) 10215 1727204038.14993: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204038.15046: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204038.15065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204038.15103: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204038.15171: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204038.16896: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204038.17073: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpffm4nlk0 /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py <<< 10215 1727204038.17077: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpffm4nlk0" to remote "/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py" <<< 10215 1727204038.19916: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204038.19920: stderr chunk (state=3): >>><<< 10215 1727204038.19923: stdout chunk (state=3): >>><<< 10215 1727204038.19925: done transferring module to remote 10215 1727204038.19927: _low_level_execute_command(): starting 10215 1727204038.19930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/ /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py && sleep 0' 10215 1727204038.20680: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204038.20699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204038.20715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204038.20787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204038.20826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204038.20869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204038.20950: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204038.20975: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204038.21012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204038.21033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204038.23015: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204038.23019: stdout chunk (state=3): >>><<< 10215 1727204038.23021: stderr chunk (state=3): >>><<< 10215 1727204038.23024: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204038.23027: _low_level_execute_command(): starting 10215 1727204038.23029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/AnsiballZ_dnf.py && sleep 0' 10215 1727204038.23707: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204038.23756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204039.70728: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 10215 1727204039.75738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204039.75742: stdout chunk (state=3): >>><<< 10215 1727204039.75744: stderr chunk (state=3): >>><<< 10215 1727204039.75896: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
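The second "Install pgrep, sysctl" variant (line 26 of create_test_interfaces_with_dhcp.yml) did run: its conditionals ansible_os_family == 'RedHat' and ansible_distribution_major_version is version('7', '>=') both evaluated True, and the dnf module args above show it installing procps-ng, the package that ships pgrep and sysctl on Fedora/EL. A minimal sketch of that task as implied by this trace:

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng                     # taken from the module args in the result above
    state: present
  when:
    - ansible_os_family == 'RedHat'
    - ansible_distribution_major_version is version('7', '>=')

As with dnsmasq, the "Nothing to do" / changed: false result above means procps-ng was already present on managed-node3, so the task is a no-op on re-runs.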
10215 1727204039.75905: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204039.75910: _low_level_execute_command(): starting 10215 1727204039.75913: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204038.1040802-10788-252312788548094/ > /dev/null 2>&1 && sleep 0' 10215 1727204039.76511: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204039.76528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204039.76541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204039.76561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204039.76581: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204039.76597: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204039.76615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204039.76634: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204039.76648: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204039.76712: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204039.76763: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204039.76782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204039.76812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204039.76879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204039.78926: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204039.78955: stderr chunk (state=3): >>><<< 10215 1727204039.78968: stdout chunk (state=3): >>><<< 10215 1727204039.79000: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204039.79020: handler run complete 10215 1727204039.79099: attempt loop complete, returning result 10215 1727204039.79112: _execute() done 10215 1727204039.79121: dumping result to json 10215 1727204039.79132: done dumping result, returning 10215 1727204039.79160: done running TaskExecutor() for managed-node3/TASK: Install pgrep, sysctl [12b410aa-8751-3c74-8f8e-000000000011] 10215 1727204039.79176: sending task result for task 12b410aa-8751-3c74-8f8e-000000000011 10215 1727204039.79541: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000011 10215 1727204039.79544: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 10215 1727204039.79666: no more pending results, returning what we have 10215 1727204039.79670: results queue empty 10215 1727204039.79672: checking for any_errors_fatal 10215 1727204039.79679: done checking for any_errors_fatal 10215 1727204039.79680: checking for max_fail_percentage 10215 1727204039.79682: done checking for max_fail_percentage 10215 1727204039.79683: checking to see if all hosts have failed and the running result is not ok 10215 1727204039.79684: done checking to see if all hosts have failed 10215 1727204039.79685: getting the remaining hosts for this loop 10215 1727204039.79687: done getting the remaining hosts for this loop 10215 1727204039.79695: getting the next task for host managed-node3 10215 1727204039.79702: done getting next task for host managed-node3 10215 1727204039.79705: ^ task is: TASK: Create test interfaces 10215 1727204039.79711: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204039.79715: getting variables 10215 1727204039.79717: in VariableManager get_vars() 10215 1727204039.79762: Calling all_inventory to load vars for managed-node3 10215 1727204039.79886: Calling groups_inventory to load vars for managed-node3 10215 1727204039.79892: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204039.79904: Calling all_plugins_play to load vars for managed-node3 10215 1727204039.79910: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204039.79914: Calling groups_plugins_play to load vars for managed-node3 10215 1727204039.80282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204039.80617: done with get_vars() 10215 1727204039.80630: done getting variables 10215 1727204039.80759: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Tuesday 24 September 2024 14:53:59 -0400 (0:00:01.791) 0:00:08.375 ***** 10215 1727204039.80795: entering _queue_task() for managed-node3/shell 10215 1727204039.80798: Creating lock for shell 10215 1727204039.81198: worker is 1 (out of 1 available) 10215 1727204039.81214: exiting _queue_task() for managed-node3/shell 10215 1727204039.81226: done queuing things up, now waiting for results queue to drain 10215 1727204039.81227: waiting for pending results... 
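The OpenSSH debug lines that bracket every remote command (auto-mux: Trying existing master, mux_client_hello_exchange, mux_client_request_session) come from connection multiplexing: the first task opened a persistent master connection to 10.31.10.90, and each later _low_level_execute_command() call is a cheap mux client session over it. A stand-alone sketch of one such hop is shown below; the ControlPath value is illustrative, and the real invocation also carries whatever ansible_ssh_extra_args sets for the host:

  # reuse (or create) a shared master connection, then run one command over it
  ssh -o ControlMaster=auto -o ControlPersist=60s \
      -o ControlPath='~/.ansible/cp/%C' \
      root@10.31.10.90 '/bin/sh -c "echo ~ && sleep 0"'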
10215 1727204039.81534: running TaskExecutor() for managed-node3/TASK: Create test interfaces 10215 1727204039.81635: in run() - task 12b410aa-8751-3c74-8f8e-000000000012 10215 1727204039.81661: variable 'ansible_search_path' from source: unknown 10215 1727204039.81670: variable 'ansible_search_path' from source: unknown 10215 1727204039.81740: calling self._execute() 10215 1727204039.81837: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204039.81870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204039.81881: variable 'omit' from source: magic vars 10215 1727204039.82373: variable 'ansible_distribution_major_version' from source: facts 10215 1727204039.82416: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204039.82421: variable 'omit' from source: magic vars 10215 1727204039.82509: variable 'omit' from source: magic vars 10215 1727204039.83153: variable 'dhcp_interface1' from source: play vars 10215 1727204039.83158: variable 'dhcp_interface2' from source: play vars 10215 1727204039.83178: variable 'omit' from source: magic vars 10215 1727204039.83236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204039.83298: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204039.83371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204039.83376: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204039.83388: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204039.83440: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204039.83450: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204039.83480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204039.83620: Set connection var ansible_connection to ssh 10215 1727204039.83791: Set connection var ansible_pipelining to False 10215 1727204039.83797: Set connection var ansible_shell_type to sh 10215 1727204039.83799: Set connection var ansible_timeout to 10 10215 1727204039.83801: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204039.83804: Set connection var ansible_shell_executable to /bin/sh 10215 1727204039.83806: variable 'ansible_shell_executable' from source: unknown 10215 1727204039.83811: variable 'ansible_connection' from source: unknown 10215 1727204039.83813: variable 'ansible_module_compression' from source: unknown 10215 1727204039.83815: variable 'ansible_shell_type' from source: unknown 10215 1727204039.83817: variable 'ansible_shell_executable' from source: unknown 10215 1727204039.83819: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204039.83821: variable 'ansible_pipelining' from source: unknown 10215 1727204039.83823: variable 'ansible_timeout' from source: unknown 10215 1727204039.83825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204039.83963: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204039.83981: variable 'omit' from source: magic vars 10215 1727204039.84059: starting attempt loop 10215 1727204039.84065: running the handler 10215 1727204039.84068: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204039.84071: _low_level_execute_command(): starting 10215 1727204039.84074: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204039.85049: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204039.85066: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204039.85121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204039.85164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204039.86882: stdout chunk (state=3): >>>/root <<< 10215 1727204039.87203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204039.87206: stdout chunk (state=3): >>><<< 10215 1727204039.87212: stderr chunk (state=3): >>><<< 10215 1727204039.87216: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204039.87218: _low_level_execute_command(): starting 10215 1727204039.87221: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160 `" && echo ansible-tmp-1727204039.87102-10844-703378488160="` echo /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160 `" ) && sleep 0' 10215 1727204039.87883: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204039.87903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204039.87924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204039.87980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204039.88100: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204039.88121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204039.88205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204039.90184: stdout chunk (state=3): >>>ansible-tmp-1727204039.87102-10844-703378488160=/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160 <<< 10215 1727204039.90388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204039.90413: stderr chunk (state=3): >>><<< 10215 1727204039.90427: stdout chunk (state=3): >>><<< 10215 1727204039.90454: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204039.87102-10844-703378488160=/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204039.90498: variable 'ansible_module_compression' from source: unknown 10215 1727204039.90579: ANSIBALLZ: Using generic lock for ansible.legacy.command 10215 1727204039.90614: ANSIBALLZ: Acquiring lock 10215 1727204039.90618: ANSIBALLZ: Lock acquired: 139878728192448 10215 1727204039.90620: ANSIBALLZ: Creating module 10215 1727204040.04773: ANSIBALLZ: Writing module into payload 10215 1727204040.04857: ANSIBALLZ: Writing module 10215 1727204040.04876: ANSIBALLZ: Renaming module 10215 1727204040.04883: ANSIBALLZ: Done creating module 10215 1727204040.04902: variable 'ansible_facts' from source: unknown 10215 1727204040.04958: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py 10215 1727204040.05082: Sending initial data 10215 1727204040.05086: Sent initial data (151 bytes) 10215 1727204040.05548: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204040.05594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204040.05597: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204040.05601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204040.05603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204040.05642: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204040.05650: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204040.05721: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204040.07447: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204040.07794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204040.07832: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpx9m6ho49 /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py <<< 10215 1727204040.07836: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py" <<< 10215 1727204040.07864: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpx9m6ho49" to remote "/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py" <<< 10215 1727204040.08961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204040.08973: stdout chunk (state=3): >>><<< 10215 1727204040.08986: stderr chunk (state=3): >>><<< 10215 1727204040.09020: done transferring module to remote 10215 1727204040.09039: _low_level_execute_command(): starting 10215 1727204040.09051: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/ /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py && sleep 0' 10215 1727204040.09687: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204040.09704: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204040.09721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204040.09744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204040.09758: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204040.09766: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204040.09776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204040.09792: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204040.09887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204040.09910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204040.09921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204040.09988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204040.12097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204040.12109: stdout chunk (state=3): >>><<< 10215 1727204040.12112: stderr chunk (state=3): >>><<< 10215 1727204040.12332: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204040.12340: _low_level_execute_command(): starting 10215 1727204040.12343: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/AnsiballZ_command.py && sleep 0' 10215 1727204040.12946: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204040.12985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204040.12996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204040.13001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204040.13004: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204040.13006: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204040.13009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204040.13092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204040.13100: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204040.13103: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204040.13106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204040.13108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204040.13110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204040.13112: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204040.13114: stderr chunk (state=3): >>>debug2: match found <<< 10215 1727204040.13116: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204040.13268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204040.13272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204040.13275: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204040.13294: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.57504: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": 
"+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 647 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 647 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! <<< 10215 1727204041.57532: stdout chunk (state=3): >>>firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:00.303998", "end": "2024-09-24 14:54:01.571339", "delta": "0:00:01.267341", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204041.59217: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204041.59227: stdout chunk (state=3): >>><<< 10215 1727204041.59240: stderr chunk (state=3): >>><<< 10215 1727204041.59273: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 647 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 647 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:54:00.303998", "end": "2024-09-24 14:54:01.571339", "delta": "0:00:01.267341", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
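For readability, here is the script that the shell task pushed through AnsiballZ_command.py, reconstructed from the escaped _raw_params in the result above. It is condensed and reflowed: the RHEL 6 branch is omitted, since the + grep 'release 6' trace shows it was not taken on this host. The script builds two veth pairs, bridges their peer ends on testbr, and starts dnsmasq as a combined DHCPv4/DHCPv6 server with router advertisements:

  set -euxo pipefail
  exec 1>&2
  ip link add test1 type veth peer name test1p
  ip link add test2 type veth peer name test2p
  if [ -n "$(pgrep NetworkManager)" ]; then
      nmcli d set test1 managed true
      nmcli d set test2 managed true
      # NetworkManager should not manage the DHCP server ports
      nmcli d set test1p managed false
      nmcli d set test2p managed false
  fi
  ip link set test1p up
  ip link set test2p up

  # create the 'testbr' bridge that will serve both IPv4 and IPv6 DHCP
  ip link add name testbr type bridge forward_delay 0
  if [ -n "$(pgrep NetworkManager)" ]; then
      nmcli d set testbr managed false
  fi
  ip link set testbr up

  # workaround for https://bugzilla.redhat.com/show_bug.cgi?id=2079642:
  # retry the address assignment until the bridge has an inet address
  timer=0
  while ! ip addr show testbr | grep -q 'inet [1-9]'; do
      let "timer+=1"
      if [ $timer -eq 30 ]; then
          echo ERROR - could not add testbr
          ip addr
          exit 1
      fi
      sleep 1
      rc=0
      ip addr add 192.0.2.1/24 dev testbr || rc="$?"
      if [ "$rc" != 0 ]; then
          echo NOTICE - could not add testbr - error code "$rc"
          continue
      fi
      ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"
      if [ "$rc" != 0 ]; then
          echo NOTICE - could not add testbr - error code "$rc"
          continue
      fi
  done

  # non-RHEL6 branch: enslave the peer ends and start the DHCP/RA server
  ip link set test1p master testbr
  ip link set test2p master testbr
  if systemctl is-active firewalld; then
      for service in dhcp dhcpv6 dhcpv6-client; do
          if ! firewall-cmd --query-service="$service"; then
              firewall-cmd --add-service "$service"
          fi
      done
  fi
  dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease \
      --dhcp-range=192.0.2.1,192.0.2.254,240 \
      --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
      --enable-ra --interface=testbr --bind-interfaces

The stderr trace in the result (the lines prefixed with +) is simply this script replayed by set -x; on this run firewalld was inactive, so dnsmasq was started without any firewall-cmd changes.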
10215 1727204041.59480: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204041.59500: _low_level_execute_command(): starting 10215 1727204041.59515: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204039.87102-10844-703378488160/ > /dev/null 2>&1 && sleep 0' 10215 1727204041.60179: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204041.60201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204041.60220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204041.60314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204041.60355: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204041.60382: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204041.60403: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.60484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.62523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204041.62642: stderr chunk (state=3): >>><<< 10215 1727204041.62645: stdout chunk (state=3): >>><<< 10215 1727204041.62673: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204041.62682: handler run complete 10215 1727204041.62731: Evaluated conditional (False): False 10215 1727204041.62747: attempt loop complete, returning result 10215 1727204041.62750: _execute() done 10215 1727204041.62755: dumping result to json 10215 1727204041.62763: done dumping result, returning 10215 1727204041.62774: done running TaskExecutor() for managed-node3/TASK: Create test interfaces [12b410aa-8751-3c74-8f8e-000000000012] 10215 1727204041.62781: sending task result for task 12b410aa-8751-3c74-8f8e-000000000012 10215 1727204041.62923: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000012 10215 1727204041.62925: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.267341", "end": "2024-09-24 14:54:01.571339", "rc": 0, "start": "2024-09-24 14:54:00.303998" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 647 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 647 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 10215 1727204041.63105: no more pending results, returning what we have 10215 1727204041.63110: results queue empty 10215 1727204041.63111: checking for any_errors_fatal 10215 1727204041.63119: done checking for any_errors_fatal 10215 1727204041.63120: checking for max_fail_percentage 10215 1727204041.63122: done checking for max_fail_percentage 10215 1727204041.63123: checking to see if all hosts have failed and the running result is not ok 10215 1727204041.63124: done checking to see if all hosts have failed 10215 1727204041.63125: getting the remaining hosts for this loop 10215 1727204041.63127: done getting the remaining hosts for this loop 10215 1727204041.63132: getting the next task for host managed-node3 10215 1727204041.63142: done getting next task for host managed-node3 10215 1727204041.63145: ^ task is: TASK: 
Include the task 'get_interface_stat.yml' 10215 1727204041.63149: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204041.63399: getting variables 10215 1727204041.63401: in VariableManager get_vars() 10215 1727204041.63442: Calling all_inventory to load vars for managed-node3 10215 1727204041.63445: Calling groups_inventory to load vars for managed-node3 10215 1727204041.63448: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204041.63460: Calling all_plugins_play to load vars for managed-node3 10215 1727204041.63463: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204041.63467: Calling groups_plugins_play to load vars for managed-node3 10215 1727204041.63782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204041.64407: done with get_vars() 10215 1727204041.64421: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:01 -0400 (0:00:01.837) 0:00:10.212 ***** 10215 1727204041.64537: entering _queue_task() for managed-node3/include_tasks 10215 1727204041.64864: worker is 1 (out of 1 available) 10215 1727204041.64877: exiting _queue_task() for managed-node3/include_tasks 10215 1727204041.64891: done queuing things up, now waiting for results queue to drain 10215 1727204041.64893: waiting for pending results... 
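The 'Create test interfaces' task above pushes an inline shell script to the host: it creates two veth pairs (test1/test1p, test2/test2p), tells NetworkManager to leave the server-side ends unmanaged, enslaves those ends to a testbr bridge carrying 192.0.2.1/24 and 2001:DB8::1/32, and starts dnsmasq as a combined DHCPv4/DHCPv6+RA server. The sketch below reproduces that topology outside of Ansible as a minimal standalone script; it is only an approximation that omits the NetworkManager 'managed' toggles, the RHEL 6 branch, and the address-retry workaround seen in the log, and the cleanup helper at the end is not part of the original task.

    #!/bin/bash
    # Minimal sketch of the test topology built by the task above.
    # Assumes iproute2 and dnsmasq are available; run as root.
    set -euxo pipefail

    # veth pairs: test1/test2 are the devices under test,
    # test1p/test2p are the peer ends that join the bridge
    ip link add test1 type veth peer name test1p
    ip link add test2 type veth peer name test2p
    ip link set test1p up
    ip link set test2p up

    # bridge acting as the DHCP/RA server side
    ip link add name testbr type bridge forward_delay 0
    ip link set testbr up
    ip addr add 192.0.2.1/24 dev testbr
    ip -6 addr add 2001:DB8::1/32 dev testbr
    ip link set test1p master testbr
    ip link set test2p master testbr

    # joint DHCPv4/DHCPv6 server with router advertisements, same flags as in the log
    dnsmasq --pid-file=/run/dhcp_testbr.pid \
            --dhcp-leasefile=/run/dhcp_testbr.lease \
            --dhcp-range=192.0.2.1,192.0.2.254,240 \
            --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 \
            --enable-ra --interface=testbr --bind-interfaces

    # teardown helper (not in the original task); deleting the veth and
    # bridge links removes the enslaved peer ends automatically
    cleanup() {
        kill "$(cat /run/dhcp_testbr.pid)" 2>/dev/null || true
        ip link del testbr 2>/dev/null || true
        ip link del test1 2>/dev/null || true
        ip link del test2 2>/dev/null || true
    }

Call cleanup manually once the bridge and veth devices are no longer needed.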
10215 1727204041.65208: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10215 1727204041.65364: in run() - task 12b410aa-8751-3c74-8f8e-000000000016 10215 1727204041.65369: variable 'ansible_search_path' from source: unknown 10215 1727204041.65373: variable 'ansible_search_path' from source: unknown 10215 1727204041.65396: calling self._execute() 10215 1727204041.65499: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204041.65513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204041.65580: variable 'omit' from source: magic vars 10215 1727204041.66080: variable 'ansible_distribution_major_version' from source: facts 10215 1727204041.66103: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204041.66115: _execute() done 10215 1727204041.66133: dumping result to json 10215 1727204041.66141: done dumping result, returning 10215 1727204041.66154: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-3c74-8f8e-000000000016] 10215 1727204041.66164: sending task result for task 12b410aa-8751-3c74-8f8e-000000000016 10215 1727204041.66435: no more pending results, returning what we have 10215 1727204041.66442: in VariableManager get_vars() 10215 1727204041.66502: Calling all_inventory to load vars for managed-node3 10215 1727204041.66507: Calling groups_inventory to load vars for managed-node3 10215 1727204041.66509: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204041.66528: Calling all_plugins_play to load vars for managed-node3 10215 1727204041.66532: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204041.66536: Calling groups_plugins_play to load vars for managed-node3 10215 1727204041.66977: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000016 10215 1727204041.66980: WORKER PROCESS EXITING 10215 1727204041.67011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204041.67269: done with get_vars() 10215 1727204041.67279: variable 'ansible_search_path' from source: unknown 10215 1727204041.67280: variable 'ansible_search_path' from source: unknown 10215 1727204041.67328: we have included files to process 10215 1727204041.67330: generating all_blocks data 10215 1727204041.67331: done generating all_blocks data 10215 1727204041.67332: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204041.67333: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204041.67336: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204041.67639: done processing included file 10215 1727204041.67642: iterating over new_blocks loaded from include file 10215 1727204041.67644: in VariableManager get_vars() 10215 1727204041.67667: done with get_vars() 10215 1727204041.67669: filtering new block on tags 10215 1727204041.67694: done filtering new block on tags 10215 1727204041.67697: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10215 
1727204041.67702: extending task lists for all hosts with included blocks 10215 1727204041.67826: done extending task lists 10215 1727204041.67828: done processing included files 10215 1727204041.67829: results queue empty 10215 1727204041.67830: checking for any_errors_fatal 10215 1727204041.67836: done checking for any_errors_fatal 10215 1727204041.67837: checking for max_fail_percentage 10215 1727204041.67838: done checking for max_fail_percentage 10215 1727204041.67839: checking to see if all hosts have failed and the running result is not ok 10215 1727204041.67840: done checking to see if all hosts have failed 10215 1727204041.67841: getting the remaining hosts for this loop 10215 1727204041.67842: done getting the remaining hosts for this loop 10215 1727204041.67845: getting the next task for host managed-node3 10215 1727204041.67849: done getting next task for host managed-node3 10215 1727204041.67851: ^ task is: TASK: Get stat for interface {{ interface }} 10215 1727204041.67854: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204041.67857: getting variables 10215 1727204041.67858: in VariableManager get_vars() 10215 1727204041.67873: Calling all_inventory to load vars for managed-node3 10215 1727204041.67876: Calling groups_inventory to load vars for managed-node3 10215 1727204041.67878: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204041.67885: Calling all_plugins_play to load vars for managed-node3 10215 1727204041.67887: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204041.67893: Calling groups_plugins_play to load vars for managed-node3 10215 1727204041.68076: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204041.68362: done with get_vars() 10215 1727204041.68375: done getting variables 10215 1727204041.68576: variable 'interface' from source: task vars 10215 1727204041.68581: variable 'dhcp_interface1' from source: play vars 10215 1727204041.68656: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.041) 0:00:10.254 ***** 10215 1727204041.68711: entering _queue_task() for managed-node3/stat 10215 1727204041.69064: worker is 1 (out of 1 available) 10215 1727204041.69081: exiting _queue_task() for managed-node3/stat 10215 1727204041.69097: done queuing things up, now waiting for results queue to drain 10215 1727204041.69099: waiting for pending results... 
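The included get_interface_stat.yml runs the stat module under the templated name 'Get stat for interface {{ interface }}', and interface resolves here to dhcp_interface1, i.e. test1. As the module result further down shows, the stat target is /sys/class/net/test1, so the check amounts to asking whether sysfs exposes a network device of that name. A rough shell equivalent, assuming the standard sysfs layout:

    # Rough equivalent of the upcoming stat check for one interface
    iface=test1
    if [ -e "/sys/class/net/${iface}" ]; then
        # for a veth device this is a symlink into /sys/devices/virtual/net
        echo "${iface}: present -> $(readlink -f "/sys/class/net/${iface}")"
    else
        echo "${iface}: missing" >&2
        exit 1
    fi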
10215 1727204041.69408: running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 10215 1727204041.69588: in run() - task 12b410aa-8751-3c74-8f8e-000000000152 10215 1727204041.69616: variable 'ansible_search_path' from source: unknown 10215 1727204041.69626: variable 'ansible_search_path' from source: unknown 10215 1727204041.69684: calling self._execute() 10215 1727204041.69791: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204041.69806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204041.69877: variable 'omit' from source: magic vars 10215 1727204041.70282: variable 'ansible_distribution_major_version' from source: facts 10215 1727204041.70305: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204041.70325: variable 'omit' from source: magic vars 10215 1727204041.70400: variable 'omit' from source: magic vars 10215 1727204041.70537: variable 'interface' from source: task vars 10215 1727204041.70548: variable 'dhcp_interface1' from source: play vars 10215 1727204041.70632: variable 'dhcp_interface1' from source: play vars 10215 1727204041.70694: variable 'omit' from source: magic vars 10215 1727204041.70720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204041.70774: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204041.70805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204041.70832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204041.70862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204041.70971: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204041.70975: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204041.70978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204041.71048: Set connection var ansible_connection to ssh 10215 1727204041.71062: Set connection var ansible_pipelining to False 10215 1727204041.71080: Set connection var ansible_shell_type to sh 10215 1727204041.71095: Set connection var ansible_timeout to 10 10215 1727204041.71107: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204041.71122: Set connection var ansible_shell_executable to /bin/sh 10215 1727204041.71153: variable 'ansible_shell_executable' from source: unknown 10215 1727204041.71161: variable 'ansible_connection' from source: unknown 10215 1727204041.71168: variable 'ansible_module_compression' from source: unknown 10215 1727204041.71175: variable 'ansible_shell_type' from source: unknown 10215 1727204041.71188: variable 'ansible_shell_executable' from source: unknown 10215 1727204041.71198: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204041.71207: variable 'ansible_pipelining' from source: unknown 10215 1727204041.71215: variable 'ansible_timeout' from source: unknown 10215 1727204041.71223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204041.71516: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204041.71520: variable 'omit' from source: magic vars 10215 1727204041.71523: starting attempt loop 10215 1727204041.71525: running the handler 10215 1727204041.71528: _low_level_execute_command(): starting 10215 1727204041.71530: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204041.72427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204041.72462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.72535: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.74598: stdout chunk (state=3): >>>/root <<< 10215 1727204041.74861: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204041.74866: stdout chunk (state=3): >>><<< 10215 1727204041.74869: stderr chunk (state=3): >>><<< 10215 1727204041.74872: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204041.74876: _low_level_execute_command(): starting 10215 1727204041.74879: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651 `" && echo ansible-tmp-1727204041.7475057-10949-261176150151651="` echo 
/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651 `" ) && sleep 0' 10215 1727204041.75844: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204041.75998: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204041.76016: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.76098: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.78162: stdout chunk (state=3): >>>ansible-tmp-1727204041.7475057-10949-261176150151651=/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651 <<< 10215 1727204041.78610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204041.78639: stdout chunk (state=3): >>><<< 10215 1727204041.78643: stderr chunk (state=3): >>><<< 10215 1727204041.78665: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204041.7475057-10949-261176150151651=/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204041.78935: variable 'ansible_module_compression' from source: unknown 10215 1727204041.78996: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204041.79196: variable 'ansible_facts' from source: unknown 10215 1727204041.79540: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py 10215 1727204041.80384: Sending initial data 10215 1727204041.80388: Sent initial data (153 bytes) 10215 1727204041.81516: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204041.81752: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204041.81840: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.81913: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.83646: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204041.83671: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204041.83715: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpjz1dtkij /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py <<< 10215 1727204041.83803: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py" <<< 10215 1727204041.83813: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpjz1dtkij" to remote "/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py" <<< 10215 1727204041.85346: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204041.85350: stdout chunk (state=3): >>><<< 10215 1727204041.85365: stderr chunk (state=3): >>><<< 10215 1727204041.85393: done transferring module to remote 10215 1727204041.85406: _low_level_execute_command(): starting 10215 1727204041.85416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/ /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py && sleep 0' 10215 1727204041.86524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204041.86537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204041.86562: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204041.86569: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204041.86651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204041.86672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204041.86688: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.86784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204041.88797: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204041.88801: stderr chunk (state=3): >>><<< 10215 1727204041.88804: stdout chunk (state=3): >>><<< 10215 1727204041.88807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204041.88814: _low_level_execute_command(): starting 10215 1727204041.88819: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/AnsiballZ_stat.py && sleep 0' 10215 1727204041.89713: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204041.89718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204041.89895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204041.89917: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204041.89932: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204041.89951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204041.90040: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.07509: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34869, "dev": 23, "nlink": 1, "atime": 1727204040.3114147, "mtime": 1727204040.3114147, "ctime": 1727204040.3114147, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 
1727204042.09047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204042.09208: stderr chunk (state=3): >>><<< 10215 1727204042.09212: stdout chunk (state=3): >>><<< 10215 1727204042.09240: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34869, "dev": 23, "nlink": 1, "atime": 1727204040.3114147, "mtime": 1727204040.3114147, "ctime": 1727204040.3114147, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
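The exchange above is the generic per-task transport sequence of the ssh connection plugin: create a temporary directory under ~/.ansible/tmp, upload the self-contained AnsiballZ_stat.py payload into it via sftp over the existing ControlMaster connection, mark it executable, run it with the remote Python, and read the JSON result from stdout (the temporary directory is removed right after, as the next command shows). Condensed into plain shell on the managed node, the steps look roughly like this; the directory name is a placeholder, since the real one is timestamped, and the payload is assumed to have been copied in already:

    # Approximate remote-side command sequence for this stat task
    tmp=/root/.ansible/tmp/ansible-tmp-EXAMPLE        # placeholder; real name is timestamped
    ( umask 77 && mkdir -p "$tmp" )                   # 1. create the module tmp dir
    # 2. AnsiballZ_stat.py is uploaded into "$tmp" via sftp over the SSH mux
    chmod u+x "$tmp" "$tmp/AnsiballZ_stat.py"         # 3. make dir and module executable
    /usr/bin/python3.12 "$tmp/AnsiballZ_stat.py"      # 4. run it; JSON result arrives on stdout
    rm -rf "$tmp"                                     # 5. cleanup, issued as the next command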
10215 1727204042.09550: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204042.09554: _low_level_execute_command(): starting 10215 1727204042.09557: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204041.7475057-10949-261176150151651/ > /dev/null 2>&1 && sleep 0' 10215 1727204042.10816: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.10843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204042.11027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.11091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.13122: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204042.13329: stderr chunk (state=3): >>><<< 10215 1727204042.13333: stdout chunk (state=3): >>><<< 10215 1727204042.13452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204042.13461: handler run complete 10215 1727204042.13463: attempt loop complete, returning result 10215 1727204042.13466: _execute() done 10215 1727204042.13468: dumping result to json 10215 1727204042.13470: done dumping result, returning 10215 1727204042.13472: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test1 [12b410aa-8751-3c74-8f8e-000000000152] 10215 1727204042.13474: sending task result for task 12b410aa-8751-3c74-8f8e-000000000152 10215 1727204042.13637: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000152 10215 1727204042.13641: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204040.3114147, "block_size": 4096, "blocks": 0, "ctime": 1727204040.3114147, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34869, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204040.3114147, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10215 1727204042.13795: no more pending results, returning what we have 10215 1727204042.13800: results queue empty 10215 1727204042.13801: checking for any_errors_fatal 10215 1727204042.13804: done checking for any_errors_fatal 10215 1727204042.13805: checking for max_fail_percentage 10215 1727204042.13806: done checking for max_fail_percentage 10215 1727204042.13810: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.13812: done checking to see if all hosts have failed 10215 1727204042.13813: getting the remaining hosts for this loop 10215 1727204042.13815: done getting the remaining hosts for this loop 10215 1727204042.13820: getting the next task for host managed-node3 10215 1727204042.13830: done getting next task for host managed-node3 10215 1727204042.13833: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10215 1727204042.13837: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.13842: getting variables 10215 1727204042.13844: in VariableManager get_vars() 10215 1727204042.14262: Calling all_inventory to load vars for managed-node3 10215 1727204042.14267: Calling groups_inventory to load vars for managed-node3 10215 1727204042.14271: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.14284: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.14288: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.14294: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.14982: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.15639: done with get_vars() 10215 1727204042.15655: done getting variables 10215 1727204042.15974: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 10215 1727204042.16480: variable 'interface' from source: task vars 10215 1727204042.16485: variable 'dhcp_interface1' from source: play vars 10215 1727204042.16632: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.480) 0:00:10.735 ***** 10215 1727204042.16815: entering _queue_task() for managed-node3/assert 10215 1727204042.16817: Creating lock for assert 10215 1727204042.17593: worker is 1 (out of 1 available) 10215 1727204042.17608: exiting _queue_task() for managed-node3/assert 10215 1727204042.17621: done queuing things up, now waiting for results queue to drain 10215 1727204042.17623: waiting for pending results... 
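With the stat result registered as interface_stat, the assert task that follows only has to evaluate interface_stat.stat.exists; the same stat-then-assert pattern is then repeated for dhcp_interface2 (test2). Condensed into shell, the presence assertions for both DHCP server ports come down to something like:

    # Fail fast if either veth device is missing from sysfs
    for iface in test1 test2; do
        if [ ! -e "/sys/class/net/${iface}" ]; then
            echo "assertion failed: interface ${iface} is not present" >&2
            exit 1
        fi
    done
    echo "All assertions passed"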
10215 1727204042.18442: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' 10215 1727204042.18551: in run() - task 12b410aa-8751-3c74-8f8e-000000000017 10215 1727204042.18557: variable 'ansible_search_path' from source: unknown 10215 1727204042.18564: variable 'ansible_search_path' from source: unknown 10215 1727204042.18782: calling self._execute() 10215 1727204042.18847: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.18902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.18921: variable 'omit' from source: magic vars 10215 1727204042.20123: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.20208: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.20223: variable 'omit' from source: magic vars 10215 1727204042.20479: variable 'omit' from source: magic vars 10215 1727204042.20703: variable 'interface' from source: task vars 10215 1727204042.20916: variable 'dhcp_interface1' from source: play vars 10215 1727204042.20922: variable 'dhcp_interface1' from source: play vars 10215 1727204042.20948: variable 'omit' from source: magic vars 10215 1727204042.21025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204042.21204: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204042.21234: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204042.21462: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.21467: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.21470: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204042.21472: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.21474: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.21897: Set connection var ansible_connection to ssh 10215 1727204042.21901: Set connection var ansible_pipelining to False 10215 1727204042.21903: Set connection var ansible_shell_type to sh 10215 1727204042.21906: Set connection var ansible_timeout to 10 10215 1727204042.21908: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204042.21925: Set connection var ansible_shell_executable to /bin/sh 10215 1727204042.21964: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.22014: variable 'ansible_connection' from source: unknown 10215 1727204042.22033: variable 'ansible_module_compression' from source: unknown 10215 1727204042.22042: variable 'ansible_shell_type' from source: unknown 10215 1727204042.22051: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.22071: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.22123: variable 'ansible_pipelining' from source: unknown 10215 1727204042.22149: variable 'ansible_timeout' from source: unknown 10215 1727204042.22161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.22692: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204042.22698: variable 'omit' from source: magic vars 10215 1727204042.22701: starting attempt loop 10215 1727204042.22704: running the handler 10215 1727204042.23138: variable 'interface_stat' from source: set_fact 10215 1727204042.23171: Evaluated conditional (interface_stat.stat.exists): True 10215 1727204042.23241: handler run complete 10215 1727204042.23267: attempt loop complete, returning result 10215 1727204042.23275: _execute() done 10215 1727204042.23284: dumping result to json 10215 1727204042.23301: done dumping result, returning 10215 1727204042.23392: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-3c74-8f8e-000000000017] 10215 1727204042.23406: sending task result for task 12b410aa-8751-3c74-8f8e-000000000017 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204042.23747: no more pending results, returning what we have 10215 1727204042.23751: results queue empty 10215 1727204042.23753: checking for any_errors_fatal 10215 1727204042.23763: done checking for any_errors_fatal 10215 1727204042.23765: checking for max_fail_percentage 10215 1727204042.23766: done checking for max_fail_percentage 10215 1727204042.23768: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.23769: done checking to see if all hosts have failed 10215 1727204042.23770: getting the remaining hosts for this loop 10215 1727204042.23772: done getting the remaining hosts for this loop 10215 1727204042.23904: getting the next task for host managed-node3 10215 1727204042.23915: done getting next task for host managed-node3 10215 1727204042.23919: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10215 1727204042.23922: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.23926: getting variables 10215 1727204042.23928: in VariableManager get_vars() 10215 1727204042.23980: Calling all_inventory to load vars for managed-node3 10215 1727204042.23984: Calling groups_inventory to load vars for managed-node3 10215 1727204042.24361: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.24374: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.24378: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.24383: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.24905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.25418: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000017 10215 1727204042.25422: WORKER PROCESS EXITING 10215 1727204042.25469: done with get_vars() 10215 1727204042.25485: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.088) 0:00:10.824 ***** 10215 1727204042.25698: entering _queue_task() for managed-node3/include_tasks 10215 1727204042.26279: worker is 1 (out of 1 available) 10215 1727204042.26437: exiting _queue_task() for managed-node3/include_tasks 10215 1727204042.26450: done queuing things up, now waiting for results queue to drain 10215 1727204042.26452: waiting for pending results... 10215 1727204042.26626: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10215 1727204042.26850: in run() - task 12b410aa-8751-3c74-8f8e-00000000001b 10215 1727204042.26885: variable 'ansible_search_path' from source: unknown 10215 1727204042.26898: variable 'ansible_search_path' from source: unknown 10215 1727204042.26947: calling self._execute() 10215 1727204042.27051: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.27066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.27095: variable 'omit' from source: magic vars 10215 1727204042.28076: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.28080: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.28083: _execute() done 10215 1727204042.28086: dumping result to json 10215 1727204042.28091: done dumping result, returning 10215 1727204042.28094: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-3c74-8f8e-00000000001b] 10215 1727204042.28097: sending task result for task 12b410aa-8751-3c74-8f8e-00000000001b 10215 1727204042.28234: no more pending results, returning what we have 10215 1727204042.28241: in VariableManager get_vars() 10215 1727204042.28416: Calling all_inventory to load vars for managed-node3 10215 1727204042.28421: Calling groups_inventory to load vars for managed-node3 10215 1727204042.28424: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.28443: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.28447: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.28452: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.29182: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000001b 10215 
1727204042.29186: WORKER PROCESS EXITING 10215 1727204042.29219: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.30164: done with get_vars() 10215 1727204042.30177: variable 'ansible_search_path' from source: unknown 10215 1727204042.30179: variable 'ansible_search_path' from source: unknown 10215 1727204042.30360: we have included files to process 10215 1727204042.30362: generating all_blocks data 10215 1727204042.30364: done generating all_blocks data 10215 1727204042.30369: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204042.30371: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204042.30380: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204042.30719: done processing included file 10215 1727204042.30721: iterating over new_blocks loaded from include file 10215 1727204042.30723: in VariableManager get_vars() 10215 1727204042.30751: done with get_vars() 10215 1727204042.30753: filtering new block on tags 10215 1727204042.30774: done filtering new block on tags 10215 1727204042.30777: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10215 1727204042.30784: extending task lists for all hosts with included blocks 10215 1727204042.31041: done extending task lists 10215 1727204042.31043: done processing included files 10215 1727204042.31044: results queue empty 10215 1727204042.31045: checking for any_errors_fatal 10215 1727204042.31050: done checking for any_errors_fatal 10215 1727204042.31051: checking for max_fail_percentage 10215 1727204042.31052: done checking for max_fail_percentage 10215 1727204042.31053: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.31054: done checking to see if all hosts have failed 10215 1727204042.31055: getting the remaining hosts for this loop 10215 1727204042.31057: done getting the remaining hosts for this loop 10215 1727204042.31060: getting the next task for host managed-node3 10215 1727204042.31065: done getting next task for host managed-node3 10215 1727204042.31068: ^ task is: TASK: Get stat for interface {{ interface }} 10215 1727204042.31071: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.31074: getting variables 10215 1727204042.31075: in VariableManager get_vars() 10215 1727204042.31096: Calling all_inventory to load vars for managed-node3 10215 1727204042.31099: Calling groups_inventory to load vars for managed-node3 10215 1727204042.31102: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.31109: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.31113: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.31116: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.31448: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.31747: done with get_vars() 10215 1727204042.31761: done getting variables 10215 1727204042.32006: variable 'interface' from source: task vars 10215 1727204042.32126: variable 'dhcp_interface2' from source: play vars 10215 1727204042.32207: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.065) 0:00:10.889 ***** 10215 1727204042.32250: entering _queue_task() for managed-node3/stat 10215 1727204042.32942: worker is 1 (out of 1 available) 10215 1727204042.32956: exiting _queue_task() for managed-node3/stat 10215 1727204042.32968: done queuing things up, now waiting for results queue to drain 10215 1727204042.32970: waiting for pending results... 10215 1727204042.33637: running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 10215 1727204042.33908: in run() - task 12b410aa-8751-3c74-8f8e-00000000016a 10215 1727204042.33930: variable 'ansible_search_path' from source: unknown 10215 1727204042.33941: variable 'ansible_search_path' from source: unknown 10215 1727204042.34097: calling self._execute() 10215 1727204042.34250: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.34265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.34333: variable 'omit' from source: magic vars 10215 1727204042.35219: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.35241: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.35306: variable 'omit' from source: magic vars 10215 1727204042.35500: variable 'omit' from source: magic vars 10215 1727204042.35746: variable 'interface' from source: task vars 10215 1727204042.35758: variable 'dhcp_interface2' from source: play vars 10215 1727204042.36016: variable 'dhcp_interface2' from source: play vars 10215 1727204042.36019: variable 'omit' from source: magic vars 10215 1727204042.36116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204042.36201: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204042.36262: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204042.37218: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.37222: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 10215 1727204042.37259: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204042.37269: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.37331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.37540: Set connection var ansible_connection to ssh 10215 1727204042.37694: Set connection var ansible_pipelining to False 10215 1727204042.37697: Set connection var ansible_shell_type to sh 10215 1727204042.37700: Set connection var ansible_timeout to 10 10215 1727204042.37702: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204042.37799: Set connection var ansible_shell_executable to /bin/sh 10215 1727204042.37815: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.37824: variable 'ansible_connection' from source: unknown 10215 1727204042.37983: variable 'ansible_module_compression' from source: unknown 10215 1727204042.37987: variable 'ansible_shell_type' from source: unknown 10215 1727204042.37991: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.37994: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.37996: variable 'ansible_pipelining' from source: unknown 10215 1727204042.37999: variable 'ansible_timeout' from source: unknown 10215 1727204042.38001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.38398: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204042.38523: variable 'omit' from source: magic vars 10215 1727204042.38534: starting attempt loop 10215 1727204042.38541: running the handler 10215 1727204042.38562: _low_level_execute_command(): starting 10215 1727204042.38575: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204042.40114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204042.40215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.40379: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.40411: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.40485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.42240: stdout chunk (state=3): >>>/root <<< 10215 1727204042.42347: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 10215 1727204042.42527: stderr chunk (state=3): >>><<< 10215 1727204042.42537: stdout chunk (state=3): >>><<< 10215 1727204042.42614: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204042.42640: _low_level_execute_command(): starting 10215 1727204042.42652: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574 `" && echo ansible-tmp-1727204042.4262576-11074-145466446266574="` echo /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574 `" ) && sleep 0' 10215 1727204042.44038: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204042.44145: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.44183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204042.44225: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.44243: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.44414: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.46420: stdout chunk (state=3): >>>ansible-tmp-1727204042.4262576-11074-145466446266574=/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574 <<< 10215 1727204042.46614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204042.46654: stderr 
chunk (state=3): >>><<< 10215 1727204042.46670: stdout chunk (state=3): >>><<< 10215 1727204042.46697: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204042.4262576-11074-145466446266574=/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204042.46771: variable 'ansible_module_compression' from source: unknown 10215 1727204042.46851: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204042.46895: variable 'ansible_facts' from source: unknown 10215 1727204042.47012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py 10215 1727204042.47196: Sending initial data 10215 1727204042.47309: Sent initial data (153 bytes) 10215 1727204042.47914: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204042.47945: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204042.48015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.48093: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.48118: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.48376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.49926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server 
supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204042.49986: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204042.50054: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp4k6j_6b_ /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py <<< 10215 1727204042.50058: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py" <<< 10215 1727204042.50117: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp4k6j_6b_" to remote "/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py" <<< 10215 1727204042.52274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204042.52451: stderr chunk (state=3): >>><<< 10215 1727204042.52671: stdout chunk (state=3): >>><<< 10215 1727204042.52676: done transferring module to remote 10215 1727204042.52678: _low_level_execute_command(): starting 10215 1727204042.52680: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/ /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py && sleep 0' 10215 1727204042.54126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.54244: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204042.54272: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.54342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.54620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.56545: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204042.56550: stdout chunk (state=3): >>><<< 10215 1727204042.56557: stderr chunk (state=3): >>><<< 10215 1727204042.56815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204042.56823: _low_level_execute_command(): starting 10215 1727204042.56833: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/AnsiballZ_stat.py && sleep 0' 10215 1727204042.58042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204042.58376: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204042.58381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.58384: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.58420: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.75865: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35275, "dev": 23, "nlink": 1, "atime": 1727204040.318162, "mtime": 1727204040.318162, "ctime": 1727204040.318162, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, 
"isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 1727204042.77700: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204042.77705: stdout chunk (state=3): >>><<< 10215 1727204042.77710: stderr chunk (state=3): >>><<< 10215 1727204042.77714: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35275, "dev": 23, "nlink": 1, "atime": 1727204040.318162, "mtime": 1727204040.318162, "ctime": 1727204040.318162, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204042.77716: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204042.77719: _low_level_execute_command(): starting 10215 1727204042.77721: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204042.4262576-11074-145466446266574/ > /dev/null 2>&1 && sleep 0' 10215 1727204042.78943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204042.78968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204042.78985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204042.79012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204042.79068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.79082: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204042.79182: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204042.79211: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204042.79234: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204042.79302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204042.81348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204042.81352: stdout chunk (state=3): >>><<< 10215 1727204042.81354: stderr chunk (state=3): >>><<< 10215 1727204042.81377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204042.81494: handler run complete 10215 1727204042.81498: attempt loop complete, returning result 10215 1727204042.81500: _execute() done 10215 1727204042.81502: dumping result to json 10215 1727204042.81517: done dumping result, returning 10215 1727204042.81532: done running TaskExecutor() for managed-node3/TASK: Get stat for interface test2 [12b410aa-8751-3c74-8f8e-00000000016a] 10215 1727204042.81544: sending task result for task 12b410aa-8751-3c74-8f8e-00000000016a 10215 1727204042.81702: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000016a ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204040.318162, "block_size": 4096, "blocks": 0, "ctime": 1727204040.318162, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35275, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204040.318162, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10215 1727204042.82012: no more pending results, returning what we have 10215 1727204042.82017: results queue empty 10215 1727204042.82018: checking for any_errors_fatal 10215 1727204042.82020: done checking for any_errors_fatal 10215 1727204042.82021: checking for max_fail_percentage 10215 1727204042.82024: done checking for max_fail_percentage 10215 1727204042.82025: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.82026: done checking to see if all hosts have failed 10215 1727204042.82027: getting the remaining hosts for this loop 10215 1727204042.82029: done getting the remaining hosts for this loop 10215 1727204042.82035: getting the next task for host managed-node3 10215 1727204042.82045: done getting next task for host managed-node3 10215 1727204042.82165: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10215 1727204042.82170: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.82175: getting variables 10215 1727204042.82176: in VariableManager get_vars() 10215 1727204042.82221: Calling all_inventory to load vars for managed-node3 10215 1727204042.82224: Calling groups_inventory to load vars for managed-node3 10215 1727204042.82227: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.82235: WORKER PROCESS EXITING 10215 1727204042.82247: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.82251: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.82255: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.82891: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.83235: done with get_vars() 10215 1727204042.83253: done getting variables 10215 1727204042.83332: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204042.83516: variable 'interface' from source: task vars 10215 1727204042.83521: variable 'dhcp_interface2' from source: play vars 10215 1727204042.83611: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.513) 0:00:11.403 ***** 10215 1727204042.83649: entering _queue_task() for managed-node3/assert 10215 1727204042.84058: worker is 1 (out of 1 available) 10215 1727204042.84074: exiting _queue_task() for managed-node3/assert 10215 1727204042.84088: done queuing things up, now waiting for results queue to drain 10215 1727204042.84199: waiting for pending results... 
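The assert task queued above (assert_device_present.yml:5) only checks the stat result gathered by the previous task. Based on its name and the condition this run evaluates (interface_stat.stat.exists), a minimal sketch; the failure message is an assumption, since the log only reports that all assertions passed:

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that:
      - interface_stat.stat.exists                     # condition matches the evaluation logged for this task
    msg: "Interface {{ interface }} is not present"    # hypothetical message; not shown in the log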
10215 1727204042.84401: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' 10215 1727204042.84803: in run() - task 12b410aa-8751-3c74-8f8e-00000000001c 10215 1727204042.84810: variable 'ansible_search_path' from source: unknown 10215 1727204042.84813: variable 'ansible_search_path' from source: unknown 10215 1727204042.84934: calling self._execute() 10215 1727204042.85237: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.85352: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.85376: variable 'omit' from source: magic vars 10215 1727204042.86603: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.86609: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.86612: variable 'omit' from source: magic vars 10215 1727204042.86744: variable 'omit' from source: magic vars 10215 1727204042.87053: variable 'interface' from source: task vars 10215 1727204042.87067: variable 'dhcp_interface2' from source: play vars 10215 1727204042.87277: variable 'dhcp_interface2' from source: play vars 10215 1727204042.87395: variable 'omit' from source: magic vars 10215 1727204042.87441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204042.87726: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204042.87730: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204042.87733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.87815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.87920: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204042.87930: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.87940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.88347: Set connection var ansible_connection to ssh 10215 1727204042.88350: Set connection var ansible_pipelining to False 10215 1727204042.88353: Set connection var ansible_shell_type to sh 10215 1727204042.88356: Set connection var ansible_timeout to 10 10215 1727204042.88358: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204042.88406: Set connection var ansible_shell_executable to /bin/sh 10215 1727204042.88503: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.88509: variable 'ansible_connection' from source: unknown 10215 1727204042.88512: variable 'ansible_module_compression' from source: unknown 10215 1727204042.88632: variable 'ansible_shell_type' from source: unknown 10215 1727204042.88635: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.88638: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.88640: variable 'ansible_pipelining' from source: unknown 10215 1727204042.88643: variable 'ansible_timeout' from source: unknown 10215 1727204042.88645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.89072: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204042.89178: variable 'omit' from source: magic vars 10215 1727204042.89181: starting attempt loop 10215 1727204042.89184: running the handler 10215 1727204042.89535: variable 'interface_stat' from source: set_fact 10215 1727204042.89567: Evaluated conditional (interface_stat.stat.exists): True 10215 1727204042.89579: handler run complete 10215 1727204042.89605: attempt loop complete, returning result 10215 1727204042.89626: _execute() done 10215 1727204042.89638: dumping result to json 10215 1727204042.89647: done dumping result, returning 10215 1727204042.89694: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-3c74-8f8e-00000000001c] 10215 1727204042.89698: sending task result for task 12b410aa-8751-3c74-8f8e-00000000001c ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204042.89972: no more pending results, returning what we have 10215 1727204042.89976: results queue empty 10215 1727204042.89978: checking for any_errors_fatal 10215 1727204042.89992: done checking for any_errors_fatal 10215 1727204042.89993: checking for max_fail_percentage 10215 1727204042.89995: done checking for max_fail_percentage 10215 1727204042.89997: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.89998: done checking to see if all hosts have failed 10215 1727204042.89999: getting the remaining hosts for this loop 10215 1727204042.90001: done getting the remaining hosts for this loop 10215 1727204042.90005: getting the next task for host managed-node3 10215 1727204042.90102: done getting next task for host managed-node3 10215 1727204042.90106: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 10215 1727204042.90110: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.90114: getting variables 10215 1727204042.90116: in VariableManager get_vars() 10215 1727204042.90176: Calling all_inventory to load vars for managed-node3 10215 1727204042.90179: Calling groups_inventory to load vars for managed-node3 10215 1727204042.90183: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.90248: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.90253: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.90258: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.90792: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000001c 10215 1727204042.90796: WORKER PROCESS EXITING 10215 1727204042.90826: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.91547: done with get_vars() 10215 1727204042.91562: done getting variables 10215 1727204042.91762: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.081) 0:00:11.485 ***** 10215 1727204042.91801: entering _queue_task() for managed-node3/command 10215 1727204042.92616: worker is 1 (out of 1 available) 10215 1727204042.92712: exiting _queue_task() for managed-node3/command 10215 1727204042.92728: done queuing things up, now waiting for results queue to drain 10215 1727204042.92730: waiting for pending results... 
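The command task queued above (tests_bond.yml:28) is skipped a few lines further on because network_provider is not "initscripts". A hedged sketch of a task guarded this way: the when condition matches the false_condition reported in the skip result, while the command body is purely hypothetical, since the skipped task never reveals it in the log.

- name: Backup the /etc/resolv.conf for initscript
  command: cp -a /etc/resolv.conf /tmp/resolv.conf.bak  # hypothetical backup command, not shown in the log
  when: network_provider == "initscripts"               # matches the logged false_condition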
10215 1727204042.93223: running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript 10215 1727204042.93368: in run() - task 12b410aa-8751-3c74-8f8e-00000000001d 10215 1727204042.93405: variable 'ansible_search_path' from source: unknown 10215 1727204042.93454: calling self._execute() 10215 1727204042.93570: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.93591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.93617: variable 'omit' from source: magic vars 10215 1727204042.94115: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.94139: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.94320: variable 'network_provider' from source: set_fact 10215 1727204042.94334: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204042.94381: when evaluation is False, skipping this task 10215 1727204042.94387: _execute() done 10215 1727204042.94389: dumping result to json 10215 1727204042.94392: done dumping result, returning 10215 1727204042.94395: done running TaskExecutor() for managed-node3/TASK: Backup the /etc/resolv.conf for initscript [12b410aa-8751-3c74-8f8e-00000000001d] 10215 1727204042.94399: sending task result for task 12b410aa-8751-3c74-8f8e-00000000001d 10215 1727204042.94682: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000001d 10215 1727204042.94685: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10215 1727204042.94751: no more pending results, returning what we have 10215 1727204042.94755: results queue empty 10215 1727204042.94756: checking for any_errors_fatal 10215 1727204042.94764: done checking for any_errors_fatal 10215 1727204042.94765: checking for max_fail_percentage 10215 1727204042.94767: done checking for max_fail_percentage 10215 1727204042.94769: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.94770: done checking to see if all hosts have failed 10215 1727204042.94771: getting the remaining hosts for this loop 10215 1727204042.94772: done getting the remaining hosts for this loop 10215 1727204042.94777: getting the next task for host managed-node3 10215 1727204042.94784: done getting next task for host managed-node3 10215 1727204042.94786: ^ task is: TASK: TEST Add Bond with 2 ports 10215 1727204042.94963: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204042.94969: getting variables 10215 1727204042.94970: in VariableManager get_vars() 10215 1727204042.95019: Calling all_inventory to load vars for managed-node3 10215 1727204042.95022: Calling groups_inventory to load vars for managed-node3 10215 1727204042.95025: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.95038: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.95042: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.95046: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.95309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.95626: done with get_vars() 10215 1727204042.95641: done getting variables 10215 1727204042.95762: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.039) 0:00:11.525 ***** 10215 1727204042.95796: entering _queue_task() for managed-node3/debug 10215 1727204042.96212: worker is 1 (out of 1 available) 10215 1727204042.96228: exiting _queue_task() for managed-node3/debug 10215 1727204042.96242: done queuing things up, now waiting for results queue to drain 10215 1727204042.96244: waiting for pending results... 10215 1727204042.96616: running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports 10215 1727204042.96622: in run() - task 12b410aa-8751-3c74-8f8e-00000000001e 10215 1727204042.96636: variable 'ansible_search_path' from source: unknown 10215 1727204042.96815: calling self._execute() 10215 1727204042.96819: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.96822: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.96865: variable 'omit' from source: magic vars 10215 1727204042.97320: variable 'ansible_distribution_major_version' from source: facts 10215 1727204042.97339: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204042.97343: variable 'omit' from source: magic vars 10215 1727204042.97362: variable 'omit' from source: magic vars 10215 1727204042.97396: variable 'omit' from source: magic vars 10215 1727204042.97434: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204042.97467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204042.97488: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204042.97510: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.97520: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204042.97549: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204042.97553: variable 'ansible_host' from source: host vars for 
'managed-node3' 10215 1727204042.97556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.97646: Set connection var ansible_connection to ssh 10215 1727204042.97653: Set connection var ansible_pipelining to False 10215 1727204042.97659: Set connection var ansible_shell_type to sh 10215 1727204042.97666: Set connection var ansible_timeout to 10 10215 1727204042.97673: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204042.97681: Set connection var ansible_shell_executable to /bin/sh 10215 1727204042.97709: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.97713: variable 'ansible_connection' from source: unknown 10215 1727204042.97716: variable 'ansible_module_compression' from source: unknown 10215 1727204042.97721: variable 'ansible_shell_type' from source: unknown 10215 1727204042.97723: variable 'ansible_shell_executable' from source: unknown 10215 1727204042.97726: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.97728: variable 'ansible_pipelining' from source: unknown 10215 1727204042.97730: variable 'ansible_timeout' from source: unknown 10215 1727204042.97732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.97855: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204042.97865: variable 'omit' from source: magic vars 10215 1727204042.97871: starting attempt loop 10215 1727204042.97874: running the handler 10215 1727204042.97921: handler run complete 10215 1727204042.97940: attempt loop complete, returning result 10215 1727204042.97943: _execute() done 10215 1727204042.97946: dumping result to json 10215 1727204042.97952: done dumping result, returning 10215 1727204042.97959: done running TaskExecutor() for managed-node3/TASK: TEST Add Bond with 2 ports [12b410aa-8751-3c74-8f8e-00000000001e] 10215 1727204042.97965: sending task result for task 12b410aa-8751-3c74-8f8e-00000000001e 10215 1727204042.98057: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000001e 10215 1727204042.98062: WORKER PROCESS EXITING ok: [managed-node3] => {} MSG: ################################################## 10215 1727204042.98126: no more pending results, returning what we have 10215 1727204042.98129: results queue empty 10215 1727204042.98130: checking for any_errors_fatal 10215 1727204042.98135: done checking for any_errors_fatal 10215 1727204042.98136: checking for max_fail_percentage 10215 1727204042.98138: done checking for max_fail_percentage 10215 1727204042.98139: checking to see if all hosts have failed and the running result is not ok 10215 1727204042.98140: done checking to see if all hosts have failed 10215 1727204042.98141: getting the remaining hosts for this loop 10215 1727204042.98143: done getting the remaining hosts for this loop 10215 1727204042.98147: getting the next task for host managed-node3 10215 1727204042.98155: done getting next task for host managed-node3 10215 1727204042.98161: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10215 1727204042.98164: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204042.98187: getting variables 10215 1727204042.98191: in VariableManager get_vars() 10215 1727204042.98236: Calling all_inventory to load vars for managed-node3 10215 1727204042.98240: Calling groups_inventory to load vars for managed-node3 10215 1727204042.98242: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204042.98252: Calling all_plugins_play to load vars for managed-node3 10215 1727204042.98255: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204042.98259: Calling groups_plugins_play to load vars for managed-node3 10215 1727204042.98452: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204042.98621: done with get_vars() 10215 1727204042.98631: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.029) 0:00:11.554 ***** 10215 1727204042.98711: entering _queue_task() for managed-node3/include_tasks 10215 1727204042.99004: worker is 1 (out of 1 available) 10215 1727204042.99018: exiting _queue_task() for managed-node3/include_tasks 10215 1727204042.99031: done queuing things up, now waiting for results queue to drain 10215 1727204042.99033: waiting for pending results... 
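The include just queued for 'Ensure ansible_facts used by role' (roles/network/tasks/main.yml:4) resolves, per the include processing that follows, to the role's set_facts.yml. A minimal sketch of such an include_tasks entry; the relative path form is an assumption:

- name: Ensure ansible_facts used by role
  include_tasks: set_facts.yml  # the log loads roles/network/tasks/set_facts.yml for this include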
10215 1727204042.99347: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10215 1727204042.99482: in run() - task 12b410aa-8751-3c74-8f8e-000000000026 10215 1727204042.99502: variable 'ansible_search_path' from source: unknown 10215 1727204042.99506: variable 'ansible_search_path' from source: unknown 10215 1727204042.99551: calling self._execute() 10215 1727204042.99645: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204042.99657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204042.99664: variable 'omit' from source: magic vars 10215 1727204043.00114: variable 'ansible_distribution_major_version' from source: facts 10215 1727204043.00127: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204043.00136: _execute() done 10215 1727204043.00139: dumping result to json 10215 1727204043.00144: done dumping result, returning 10215 1727204043.00154: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-3c74-8f8e-000000000026] 10215 1727204043.00161: sending task result for task 12b410aa-8751-3c74-8f8e-000000000026 10215 1727204043.00326: no more pending results, returning what we have 10215 1727204043.00335: in VariableManager get_vars() 10215 1727204043.00397: Calling all_inventory to load vars for managed-node3 10215 1727204043.00401: Calling groups_inventory to load vars for managed-node3 10215 1727204043.00404: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204043.00420: Calling all_plugins_play to load vars for managed-node3 10215 1727204043.00425: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204043.00430: Calling groups_plugins_play to load vars for managed-node3 10215 1727204043.00872: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000026 10215 1727204043.00877: WORKER PROCESS EXITING 10215 1727204043.00910: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204043.01194: done with get_vars() 10215 1727204043.01205: variable 'ansible_search_path' from source: unknown 10215 1727204043.01207: variable 'ansible_search_path' from source: unknown 10215 1727204043.01256: we have included files to process 10215 1727204043.01257: generating all_blocks data 10215 1727204043.01259: done generating all_blocks data 10215 1727204043.01265: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204043.01266: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204043.01268: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204043.02220: done processing included file 10215 1727204043.02223: iterating over new_blocks loaded from include file 10215 1727204043.02224: in VariableManager get_vars() 10215 1727204043.02259: done with get_vars() 10215 1727204043.02262: filtering new block on tags 10215 1727204043.02285: done filtering new block on tags 10215 1727204043.02288: in VariableManager get_vars() 10215 1727204043.02331: done with get_vars() 10215 1727204043.02333: filtering new block on tags 10215 1727204043.02362: done filtering new block on tags 10215 1727204043.02366: in 
VariableManager get_vars() 10215 1727204043.02397: done with get_vars() 10215 1727204043.02399: filtering new block on tags 10215 1727204043.02425: done filtering new block on tags 10215 1727204043.02436: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 10215 1727204043.02442: extending task lists for all hosts with included blocks 10215 1727204043.03466: done extending task lists 10215 1727204043.03468: done processing included files 10215 1727204043.03471: results queue empty 10215 1727204043.03471: checking for any_errors_fatal 10215 1727204043.03474: done checking for any_errors_fatal 10215 1727204043.03475: checking for max_fail_percentage 10215 1727204043.03476: done checking for max_fail_percentage 10215 1727204043.03476: checking to see if all hosts have failed and the running result is not ok 10215 1727204043.03477: done checking to see if all hosts have failed 10215 1727204043.03478: getting the remaining hosts for this loop 10215 1727204043.03479: done getting the remaining hosts for this loop 10215 1727204043.03481: getting the next task for host managed-node3 10215 1727204043.03484: done getting next task for host managed-node3 10215 1727204043.03487: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10215 1727204043.03491: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204043.03499: getting variables 10215 1727204043.03500: in VariableManager get_vars() 10215 1727204043.03515: Calling all_inventory to load vars for managed-node3 10215 1727204043.03517: Calling groups_inventory to load vars for managed-node3 10215 1727204043.03520: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204043.03525: Calling all_plugins_play to load vars for managed-node3 10215 1727204043.03528: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204043.03530: Calling groups_plugins_play to load vars for managed-node3 10215 1727204043.03668: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204043.03843: done with get_vars() 10215 1727204043.03853: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.052) 0:00:11.606 ***** 10215 1727204043.03920: entering _queue_task() for managed-node3/setup 10215 1727204043.04191: worker is 1 (out of 1 available) 10215 1727204043.04206: exiting _queue_task() for managed-node3/setup 10215 1727204043.04219: done queuing things up, now waiting for results queue to drain 10215 1727204043.04221: waiting for pending results... 10215 1727204043.04417: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10215 1727204043.04530: in run() - task 12b410aa-8751-3c74-8f8e-000000000188 10215 1727204043.04543: variable 'ansible_search_path' from source: unknown 10215 1727204043.04547: variable 'ansible_search_path' from source: unknown 10215 1727204043.04582: calling self._execute() 10215 1727204043.04650: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.04656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.04668: variable 'omit' from source: magic vars 10215 1727204043.05067: variable 'ansible_distribution_major_version' from source: facts 10215 1727204043.05071: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204043.05360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204043.08306: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204043.08494: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204043.08500: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204043.08549: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204043.08586: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204043.08714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204043.08773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 10215 1727204043.08834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204043.08897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204043.08943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204043.09022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204043.09100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204043.09119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204043.09183: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204043.09295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204043.09474: variable '__network_required_facts' from source: role '' defaults 10215 1727204043.09494: variable 'ansible_facts' from source: unknown 10215 1727204043.09647: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10215 1727204043.09657: when evaluation is False, skipping this task 10215 1727204043.09666: _execute() done 10215 1727204043.09674: dumping result to json 10215 1727204043.09682: done dumping result, returning 10215 1727204043.09700: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-3c74-8f8e-000000000188] 10215 1727204043.09717: sending task result for task 12b410aa-8751-3c74-8f8e-000000000188 skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204043.10023: no more pending results, returning what we have 10215 1727204043.10029: results queue empty 10215 1727204043.10031: checking for any_errors_fatal 10215 1727204043.10033: done checking for any_errors_fatal 10215 1727204043.10034: checking for max_fail_percentage 10215 1727204043.10036: done checking for max_fail_percentage 10215 1727204043.10038: checking to see if all hosts have failed and the running result is not ok 10215 1727204043.10039: done checking to see if all hosts have failed 10215 1727204043.10040: getting the remaining hosts for this loop 10215 1727204043.10042: done getting the remaining hosts for this loop 10215 1727204043.10047: getting the next task for host managed-node3 10215 1727204043.10060: done getting next task for host 
managed-node3 10215 1727204043.10064: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10215 1727204043.10069: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204043.10086: getting variables 10215 1727204043.10088: in VariableManager get_vars() 10215 1727204043.10146: Calling all_inventory to load vars for managed-node3 10215 1727204043.10150: Calling groups_inventory to load vars for managed-node3 10215 1727204043.10153: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204043.10168: Calling all_plugins_play to load vars for managed-node3 10215 1727204043.10172: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204043.10176: Calling groups_plugins_play to load vars for managed-node3 10215 1727204043.10686: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000188 10215 1727204043.10693: WORKER PROCESS EXITING 10215 1727204043.10736: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204043.11087: done with get_vars() 10215 1727204043.11106: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.073) 0:00:11.679 ***** 10215 1727204043.11252: entering _queue_task() for managed-node3/stat 10215 1727204043.11739: worker is 1 (out of 1 available) 10215 1727204043.11754: exiting _queue_task() for managed-node3/stat 10215 1727204043.11768: done queuing things up, now waiting for results queue to drain 10215 1727204043.11770: waiting for pending results... 
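The skip above is the role's fact pre-check: the "Ensure ansible_facts used by role are present" task only re-runs setup when some fact named in __network_required_facts is missing from ansible_facts, and here the difference was empty. A minimal sketch of a task of that shape follows; the conditional and the no_log behaviour are taken from the trace, while the setup body and gather_subset value are assumptions, not the role's verbatim source.

# Sketch only: conditional and no_log come from the trace above; the setup
# body and gather_subset value are assumptions, not the role's source.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min   # gather only a minimal subset when required facts are missing
  no_log: true           # matches the censored result shown above
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0

Gating setup this way avoids a second full fact gather on hosts where the play already collected everything the role needs.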
10215 1727204043.12006: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 10215 1727204043.12238: in run() - task 12b410aa-8751-3c74-8f8e-00000000018a 10215 1727204043.12254: variable 'ansible_search_path' from source: unknown 10215 1727204043.12258: variable 'ansible_search_path' from source: unknown 10215 1727204043.12309: calling self._execute() 10215 1727204043.12404: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.12425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.12516: variable 'omit' from source: magic vars 10215 1727204043.12959: variable 'ansible_distribution_major_version' from source: facts 10215 1727204043.12982: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204043.13157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204043.13454: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204043.13494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204043.13527: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204043.13558: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204043.13638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204043.13658: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204043.13680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204043.13704: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204043.13781: variable '__network_is_ostree' from source: set_fact 10215 1727204043.13791: Evaluated conditional (not __network_is_ostree is defined): False 10215 1727204043.13794: when evaluation is False, skipping this task 10215 1727204043.13797: _execute() done 10215 1727204043.13802: dumping result to json 10215 1727204043.13806: done dumping result, returning 10215 1727204043.13819: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-3c74-8f8e-00000000018a] 10215 1727204043.13827: sending task result for task 12b410aa-8751-3c74-8f8e-00000000018a 10215 1727204043.13920: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000018a 10215 1727204043.13923: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10215 1727204043.13983: no more pending results, returning what we have 10215 1727204043.13988: results queue empty 10215 1727204043.13991: checking for any_errors_fatal 10215 1727204043.13997: done checking for any_errors_fatal 10215 1727204043.13998: checking for 
max_fail_percentage 10215 1727204043.14000: done checking for max_fail_percentage 10215 1727204043.14001: checking to see if all hosts have failed and the running result is not ok 10215 1727204043.14003: done checking to see if all hosts have failed 10215 1727204043.14003: getting the remaining hosts for this loop 10215 1727204043.14005: done getting the remaining hosts for this loop 10215 1727204043.14010: getting the next task for host managed-node3 10215 1727204043.14018: done getting next task for host managed-node3 10215 1727204043.14021: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10215 1727204043.14026: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204043.14041: getting variables 10215 1727204043.14042: in VariableManager get_vars() 10215 1727204043.14083: Calling all_inventory to load vars for managed-node3 10215 1727204043.14086: Calling groups_inventory to load vars for managed-node3 10215 1727204043.14096: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204043.14108: Calling all_plugins_play to load vars for managed-node3 10215 1727204043.14111: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204043.14115: Calling groups_plugins_play to load vars for managed-node3 10215 1727204043.14319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204043.14487: done with get_vars() 10215 1727204043.14498: done getting variables 10215 1727204043.14551: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.033) 0:00:11.712 ***** 10215 1727204043.14580: entering _queue_task() for managed-node3/set_fact 10215 1727204043.14805: worker is 1 (out of 1 available) 10215 1727204043.14820: exiting _queue_task() for managed-node3/set_fact 10215 1727204043.14833: done queuing things up, now waiting for results queue to drain 10215 1727204043.14835: waiting for pending results... 
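Both the ostree stat check that was just skipped and the set_fact being queued here are gated on the same conditional, not __network_is_ostree is defined, so once the flag exists earlier in the run both tasks become no-ops. A sketch of that detect-once pattern follows; only the task names and the conditional come from the trace, while the stat path and register name are illustrative assumptions.

# Sketch only: task names and the conditional come from the trace; the stat
# path and register name are illustrative assumptions.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed marker file for ostree-based systems
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined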
10215 1727204043.15001: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10215 1727204043.15116: in run() - task 12b410aa-8751-3c74-8f8e-00000000018b 10215 1727204043.15129: variable 'ansible_search_path' from source: unknown 10215 1727204043.15132: variable 'ansible_search_path' from source: unknown 10215 1727204043.15167: calling self._execute() 10215 1727204043.15233: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.15239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.15249: variable 'omit' from source: magic vars 10215 1727204043.15559: variable 'ansible_distribution_major_version' from source: facts 10215 1727204043.15569: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204043.15713: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204043.15932: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204043.15973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204043.16005: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204043.16036: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204043.16114: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204043.16135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204043.16158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204043.16184: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204043.16260: variable '__network_is_ostree' from source: set_fact 10215 1727204043.16266: Evaluated conditional (not __network_is_ostree is defined): False 10215 1727204043.16271: when evaluation is False, skipping this task 10215 1727204043.16274: _execute() done 10215 1727204043.16277: dumping result to json 10215 1727204043.16289: done dumping result, returning 10215 1727204043.16294: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-3c74-8f8e-00000000018b] 10215 1727204043.16299: sending task result for task 12b410aa-8751-3c74-8f8e-00000000018b 10215 1727204043.16386: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000018b 10215 1727204043.16394: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10215 1727204043.16455: no more pending results, returning what we have 10215 1727204043.16459: results queue empty 10215 1727204043.16460: checking for any_errors_fatal 10215 1727204043.16466: done checking for any_errors_fatal 10215 
1727204043.16467: checking for max_fail_percentage 10215 1727204043.16469: done checking for max_fail_percentage 10215 1727204043.16470: checking to see if all hosts have failed and the running result is not ok 10215 1727204043.16471: done checking to see if all hosts have failed 10215 1727204043.16472: getting the remaining hosts for this loop 10215 1727204043.16474: done getting the remaining hosts for this loop 10215 1727204043.16478: getting the next task for host managed-node3 10215 1727204043.16487: done getting next task for host managed-node3 10215 1727204043.16493: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10215 1727204043.16497: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204043.16516: getting variables 10215 1727204043.16518: in VariableManager get_vars() 10215 1727204043.16558: Calling all_inventory to load vars for managed-node3 10215 1727204043.16561: Calling groups_inventory to load vars for managed-node3 10215 1727204043.16564: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204043.16575: Calling all_plugins_play to load vars for managed-node3 10215 1727204043.16577: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204043.16581: Calling groups_plugins_play to load vars for managed-node3 10215 1727204043.16728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204043.16899: done with get_vars() 10215 1727204043.16909: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.024) 0:00:11.737 ***** 10215 1727204043.16987: entering _queue_task() for managed-node3/service_facts 10215 1727204043.16988: Creating lock for service_facts 10215 1727204043.17223: worker is 1 (out of 1 available) 10215 1727204043.17238: exiting _queue_task() for managed-node3/service_facts 10215 1727204043.17251: done queuing things up, now waiting for results queue to drain 10215 1727204043.17253: waiting for pending results... 
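From here the trace shows the full remote-module lifecycle for the service_facts action: build the AnsiballZ payload, create a remote temp directory over SSH, sftp the wrapper, chmod it, run it with the remote python3.12, and read back the ansible_facts.services dict on stdout. A minimal sketch of driving that same module and consuming one entry of the returned dict follows; the play header and the debug task are illustrative, while the task name and the services structure match the trace.

# Sketch only: the play header and the debug task are illustrative; the
# services dict layout matches the module stdout captured later in the trace.
- hosts: managed-node3
  gather_facts: false
  tasks:
    - name: Check which services are running
      service_facts:                # returns ansible_facts.services

    - name: Report NetworkManager state from the collected service facts
      debug:
        msg: "{{ ansible_facts.services['NetworkManager.service'].state }}"

Each entry in the returned dict is keyed by unit name and carries name, state, status and source, as seen in the stdout chunk below.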
10215 1727204043.17431: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 10215 1727204043.17537: in run() - task 12b410aa-8751-3c74-8f8e-00000000018d 10215 1727204043.17551: variable 'ansible_search_path' from source: unknown 10215 1727204043.17555: variable 'ansible_search_path' from source: unknown 10215 1727204043.17592: calling self._execute() 10215 1727204043.17657: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.17664: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.17674: variable 'omit' from source: magic vars 10215 1727204043.18048: variable 'ansible_distribution_major_version' from source: facts 10215 1727204043.18059: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204043.18066: variable 'omit' from source: magic vars 10215 1727204043.18127: variable 'omit' from source: magic vars 10215 1727204043.18160: variable 'omit' from source: magic vars 10215 1727204043.18195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204043.18229: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204043.18252: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204043.18270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204043.18281: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204043.18310: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204043.18316: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.18321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.18407: Set connection var ansible_connection to ssh 10215 1727204043.18416: Set connection var ansible_pipelining to False 10215 1727204043.18422: Set connection var ansible_shell_type to sh 10215 1727204043.18429: Set connection var ansible_timeout to 10 10215 1727204043.18436: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204043.18445: Set connection var ansible_shell_executable to /bin/sh 10215 1727204043.18464: variable 'ansible_shell_executable' from source: unknown 10215 1727204043.18467: variable 'ansible_connection' from source: unknown 10215 1727204043.18472: variable 'ansible_module_compression' from source: unknown 10215 1727204043.18476: variable 'ansible_shell_type' from source: unknown 10215 1727204043.18478: variable 'ansible_shell_executable' from source: unknown 10215 1727204043.18481: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204043.18486: variable 'ansible_pipelining' from source: unknown 10215 1727204043.18491: variable 'ansible_timeout' from source: unknown 10215 1727204043.18497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204043.18666: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204043.18676: variable 'omit' from source: magic vars 10215 
1727204043.18679: starting attempt loop 10215 1727204043.18682: running the handler 10215 1727204043.18701: _low_level_execute_command(): starting 10215 1727204043.18708: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204043.19258: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.19261: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.19266: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204043.19269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.19326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204043.19330: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204043.19332: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204043.19386: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204043.21160: stdout chunk (state=3): >>>/root <<< 10215 1727204043.21269: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204043.21338: stderr chunk (state=3): >>><<< 10215 1727204043.21342: stdout chunk (state=3): >>><<< 10215 1727204043.21363: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204043.21376: _low_level_execute_command(): starting 10215 1727204043.21383: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112 `" && echo 
ansible-tmp-1727204043.2136393-11197-87602958407112="` echo /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112 `" ) && sleep 0' 10215 1727204043.21856: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.21899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204043.21903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.21917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.21920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.21965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204043.21969: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204043.21973: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204043.22016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204043.23993: stdout chunk (state=3): >>>ansible-tmp-1727204043.2136393-11197-87602958407112=/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112 <<< 10215 1727204043.24108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204043.24175: stderr chunk (state=3): >>><<< 10215 1727204043.24179: stdout chunk (state=3): >>><<< 10215 1727204043.24199: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204043.2136393-11197-87602958407112=/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204043.24246: variable 'ansible_module_compression' from source: unknown 10215 
1727204043.24292: ANSIBALLZ: Using lock for service_facts 10215 1727204043.24296: ANSIBALLZ: Acquiring lock 10215 1727204043.24299: ANSIBALLZ: Lock acquired: 139878725086880 10215 1727204043.24302: ANSIBALLZ: Creating module 10215 1727204043.35306: ANSIBALLZ: Writing module into payload 10215 1727204043.35396: ANSIBALLZ: Writing module 10215 1727204043.35420: ANSIBALLZ: Renaming module 10215 1727204043.35427: ANSIBALLZ: Done creating module 10215 1727204043.35443: variable 'ansible_facts' from source: unknown 10215 1727204043.35502: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py 10215 1727204043.35629: Sending initial data 10215 1727204043.35633: Sent initial data (161 bytes) 10215 1727204043.36094: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204043.36132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204043.36135: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.36138: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204043.36141: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.36143: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.36206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204043.36209: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204043.36216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204043.36258: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204043.37966: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204043.37998: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204043.38032: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmplje76qm9 /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py <<< 10215 1727204043.38036: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py" <<< 10215 1727204043.38067: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmplje76qm9" to remote "/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py" <<< 10215 1727204043.38073: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py" <<< 10215 1727204043.38870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204043.38949: stderr chunk (state=3): >>><<< 10215 1727204043.38953: stdout chunk (state=3): >>><<< 10215 1727204043.38975: done transferring module to remote 10215 1727204043.38988: _low_level_execute_command(): starting 10215 1727204043.38996: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/ /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py && sleep 0' 10215 1727204043.39455: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204043.39465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204043.39494: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.39503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.39506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.39559: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204043.39563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204043.39607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204043.41443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204043.41506: stderr chunk (state=3): >>><<< 10215 1727204043.41513: stdout chunk (state=3): >>><<< 10215 1727204043.41525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204043.41528: _low_level_execute_command(): starting 10215 1727204043.41535: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/AnsiballZ_service_facts.py && sleep 0' 10215 1727204043.42040: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.42044: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.42047: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204043.42049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204043.42096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204043.42114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204043.42117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204043.42178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.36032: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": 
"cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10215 1727204045.36103: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": 
"systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name<<< 10215 1727204045.36116: stdout chunk (state=3): >>>": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": 
{"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactiv<<< 10215 1727204045.36132: stdout chunk (state=3): >>>e", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": <<< 10215 1727204045.36155: stdout chunk (state=3): >>>"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10215 1727204045.37837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204045.37841: stdout chunk (state=3): >>><<< 10215 1727204045.37843: stderr chunk (state=3): >>><<< 10215 1727204045.37998: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": 
{"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", 
"state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": 
"systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204045.40386: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204045.40413: _low_level_execute_command(): starting 10215 1727204045.40513: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204043.2136393-11197-87602958407112/ > /dev/null 2>&1 && sleep 0' 10215 1727204045.41136: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204045.41201: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.41280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204045.41324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.41400: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.43373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204045.43377: stdout chunk (state=3): >>><<< 10215 1727204045.43598: stderr chunk (state=3): >>><<< 10215 1727204045.43602: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204045.43605: handler run complete 10215 1727204045.43713: variable 'ansible_facts' from source: unknown 10215 1727204045.43956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204045.44783: variable 'ansible_facts' from source: unknown 10215 1727204045.45023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204045.45406: attempt loop complete, returning result 10215 1727204045.45412: _execute() done 10215 1727204045.45418: dumping result to json 10215 1727204045.45510: done dumping result, returning 10215 1727204045.45521: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-3c74-8f8e-00000000018d] 10215 1727204045.45528: sending task result for task 12b410aa-8751-3c74-8f8e-00000000018d ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204045.46819: no more pending results, returning what we have 10215 1727204045.46822: results queue empty 10215 1727204045.46823: checking for any_errors_fatal 10215 1727204045.46828: done checking for any_errors_fatal 10215 1727204045.46829: checking for max_fail_percentage 10215 1727204045.46830: done checking for max_fail_percentage 10215 1727204045.46831: checking to see if all hosts have failed and the running result is not ok 10215 1727204045.46832: done checking to see if all hosts have failed 10215 1727204045.46833: getting the remaining hosts for this loop 10215 1727204045.46835: done getting the remaining hosts for this loop 10215 1727204045.46839: getting the next task for host managed-node3 10215 1727204045.46845: done getting next task for host managed-node3 10215 1727204045.46848: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 10215 1727204045.46853: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204045.46863: getting variables 10215 1727204045.46864: in VariableManager get_vars() 10215 1727204045.46902: Calling all_inventory to load vars for managed-node3 10215 1727204045.46905: Calling groups_inventory to load vars for managed-node3 10215 1727204045.46908: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204045.46919: Calling all_plugins_play to load vars for managed-node3 10215 1727204045.46922: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204045.46926: Calling groups_plugins_play to load vars for managed-node3 10215 1727204045.47506: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000018d 10215 1727204045.47510: WORKER PROCESS EXITING 10215 1727204045.47655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204045.48454: done with get_vars() 10215 1727204045.48471: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:05 -0400 (0:00:02.315) 0:00:14.053 ***** 10215 1727204045.48592: entering _queue_task() for managed-node3/package_facts 10215 1727204045.48594: Creating lock for package_facts 10215 1727204045.48938: worker is 1 (out of 1 available) 10215 1727204045.49066: exiting _queue_task() for managed-node3/package_facts 10215 1727204045.49077: done queuing things up, now waiting for results queue to drain 10215 1727204045.49078: waiting for pending results... 
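The service_facts payload printed above is a flat mapping from unit name to its name, state, status, and source; the task result itself is reported as censored only because no_log: true was set on it. A minimal sketch of filtering such a mapping, using a small hand-copied subset of the entries visible in this log rather than a live call to the module:

```python
# Sketch only: filter a service_facts-style mapping for running units.
# The dictionary below is a small hand-copied subset of the output above,
# not a live call to the service_facts module.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service", "state": "running",
                               "status": "enabled", "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service", "state": "inactive",
                          "status": "disabled", "source": "systemd"},
    "sshd.service": {"name": "sshd.service", "state": "running",
                     "status": "enabled", "source": "systemd"},
}

running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['NetworkManager.service', 'sshd.service']
```

A role consuming these facts would typically look up specific keys such as "NetworkManager.service" or "firewalld.service" rather than iterating the whole table; that usage is an assumption here, not something shown in this log.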
10215 1727204045.49277: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 10215 1727204045.49454: in run() - task 12b410aa-8751-3c74-8f8e-00000000018e 10215 1727204045.49477: variable 'ansible_search_path' from source: unknown 10215 1727204045.49485: variable 'ansible_search_path' from source: unknown 10215 1727204045.49536: calling self._execute() 10215 1727204045.49635: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204045.49651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204045.49668: variable 'omit' from source: magic vars 10215 1727204045.50154: variable 'ansible_distribution_major_version' from source: facts 10215 1727204045.50158: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204045.50160: variable 'omit' from source: magic vars 10215 1727204045.50245: variable 'omit' from source: magic vars 10215 1727204045.50302: variable 'omit' from source: magic vars 10215 1727204045.50351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204045.50408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204045.50437: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204045.50465: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204045.50497: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204045.50538: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204045.50547: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204045.50556: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204045.50687: Set connection var ansible_connection to ssh 10215 1727204045.50795: Set connection var ansible_pipelining to False 10215 1727204045.50798: Set connection var ansible_shell_type to sh 10215 1727204045.50808: Set connection var ansible_timeout to 10 10215 1727204045.50810: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204045.50813: Set connection var ansible_shell_executable to /bin/sh 10215 1727204045.50815: variable 'ansible_shell_executable' from source: unknown 10215 1727204045.50817: variable 'ansible_connection' from source: unknown 10215 1727204045.50820: variable 'ansible_module_compression' from source: unknown 10215 1727204045.50822: variable 'ansible_shell_type' from source: unknown 10215 1727204045.50824: variable 'ansible_shell_executable' from source: unknown 10215 1727204045.50826: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204045.50828: variable 'ansible_pipelining' from source: unknown 10215 1727204045.50831: variable 'ansible_timeout' from source: unknown 10215 1727204045.50842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204045.51104: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204045.51124: variable 'omit' from source: magic vars 10215 
1727204045.51142: starting attempt loop 10215 1727204045.51150: running the handler 10215 1727204045.51171: _low_level_execute_command(): starting 10215 1727204045.51184: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204045.52025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.52112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204045.52142: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.52216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.53923: stdout chunk (state=3): >>>/root <<< 10215 1727204045.54138: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204045.54142: stdout chunk (state=3): >>><<< 10215 1727204045.54145: stderr chunk (state=3): >>><<< 10215 1727204045.54293: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204045.54298: _low_level_execute_command(): starting 10215 1727204045.54301: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859 `" && echo ansible-tmp-1727204045.5417407-11244-24330821932859="` echo /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859 `" ) && sleep 0' 10215 1727204045.54911: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 <<< 10215 1727204045.54926: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204045.54943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204045.54976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204045.55086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204045.55148: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.55180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.57202: stdout chunk (state=3): >>>ansible-tmp-1727204045.5417407-11244-24330821932859=/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859 <<< 10215 1727204045.57274: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204045.57500: stderr chunk (state=3): >>><<< 10215 1727204045.57504: stdout chunk (state=3): >>><<< 10215 1727204045.57509: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204045.5417407-11244-24330821932859=/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204045.57513: variable 'ansible_module_compression' from source: unknown 10215 1727204045.57515: ANSIBALLZ: Using lock for package_facts 10215 1727204045.57517: ANSIBALLZ: Acquiring lock 10215 1727204045.57519: ANSIBALLZ: Lock acquired: 139878722618624 10215 1727204045.57521: ANSIBALLZ: Creating module 10215 1727204045.85647: ANSIBALLZ: Writing module into payload 10215 1727204045.85772: ANSIBALLZ: Writing module 
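(The ANSIBALLZ records above show the controller building a self-contained payload for the package_facts module before shipping it to the remote host over the multiplexed SSH connection. For reference, a task of roughly the following shape triggers this collection step; the play below is a minimal sketch, and the host pattern, task name, and manager value are illustrative assumptions, not taken from this log.)

---
# Hypothetical minimal play; the actual playbook driving this run is not shown in this excerpt.
- hosts: all
  gather_facts: false
  tasks:
    - name: Gather installed package facts   # illustrative task name
      ansible.builtin.package_facts:
        manager: auto                         # auto-detects the backend; resolves to rpm on Fedora

(On completion the module returns ansible_facts.packages, which is the large JSON document emitted later in this log.)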
10215 1727204045.85804: ANSIBALLZ: Renaming module 10215 1727204045.85813: ANSIBALLZ: Done creating module 10215 1727204045.85848: variable 'ansible_facts' from source: unknown 10215 1727204045.85994: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py 10215 1727204045.86124: Sending initial data 10215 1727204045.86128: Sent initial data (161 bytes) 10215 1727204045.86586: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204045.86633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204045.86637: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204045.86639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.86642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204045.86644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204045.86646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.86699: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204045.86702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204045.86704: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.86770: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.88470: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204045.88505: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204045.88540: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpt36cu4po /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py <<< 10215 1727204045.88544: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py" <<< 10215 1727204045.88575: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpt36cu4po" to remote "/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py" <<< 10215 1727204045.90224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204045.90286: stderr chunk (state=3): >>><<< 10215 1727204045.90292: stdout chunk (state=3): >>><<< 10215 1727204045.90315: done transferring module to remote 10215 1727204045.90327: _low_level_execute_command(): starting 10215 1727204045.90330: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/ /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py && sleep 0' 10215 1727204045.90798: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204045.90802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.90805: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204045.90810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.90859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204045.90868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.90901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204045.92738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204045.92784: stderr chunk (state=3): >>><<< 10215 1727204045.92788: stdout chunk (state=3): >>><<< 10215 1727204045.92805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204045.92811: _low_level_execute_command(): starting 10215 1727204045.92814: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/AnsiballZ_package_facts.py && sleep 0' 10215 1727204045.93253: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204045.93295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204045.93300: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204045.93303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.93305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204045.93307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204045.93361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204045.93364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204045.93404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204046.56617: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 10215 1727204046.56844: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": 
"2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": 
"1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 10215 1727204046.56863: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": 
"dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": 
"libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 10215 1727204046.56870: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 10215 1727204046.56924: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": 
"62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", 
"version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 10215 1727204046.56931: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", 
"version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 10215 1727204046.56934: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": 
"python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10215 1727204046.58745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204046.58804: stderr chunk (state=3): >>><<< 10215 1727204046.58807: stdout chunk (state=3): >>><<< 10215 1727204046.58850: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204046.62406: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204046.62429: _low_level_execute_command(): starting 10215 1727204046.62434: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204045.5417407-11244-24330821932859/ > /dev/null 2>&1 && sleep 0' 10215 1727204046.62893: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204046.62897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204046.62899: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204046.62902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204046.62962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204046.62968: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204046.63015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204046.64960: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204046.65003: stderr chunk (state=3): >>><<< 10215 1727204046.65007: stdout chunk (state=3): >>><<< 10215 1727204046.65023: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204046.65030: handler run complete 10215 1727204046.66265: variable 'ansible_facts' from source: unknown 10215 1727204046.66804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.69297: variable 'ansible_facts' from source: unknown 10215 1727204046.69709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.71145: attempt loop complete, returning result 10215 1727204046.71167: _execute() done 10215 1727204046.71170: dumping result to json 10215 1727204046.71491: done dumping result, returning 10215 1727204046.71505: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-3c74-8f8e-00000000018e] 10215 1727204046.71511: sending task result for task 12b410aa-8751-3c74-8f8e-00000000018e 10215 1727204046.75024: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000018e 10215 1727204046.75028: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204046.75132: no more pending results, returning what we have 10215 1727204046.75135: results queue empty 10215 1727204046.75136: checking for any_errors_fatal 10215 1727204046.75141: done checking for any_errors_fatal 10215 1727204046.75142: checking for max_fail_percentage 10215 1727204046.75144: done checking for max_fail_percentage 10215 1727204046.75145: checking to see if all hosts have failed and the running result is not ok 10215 1727204046.75145: done checking to see if all hosts have failed 10215 1727204046.75146: getting the remaining hosts for this loop 10215 1727204046.75148: done getting the remaining hosts for this loop 10215 1727204046.75152: getting the next task for host managed-node3 10215 1727204046.75158: done getting next task for host managed-node3 10215 1727204046.75162: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10215 1727204046.75165: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204046.75177: getting variables 10215 1727204046.75179: in VariableManager get_vars() 10215 1727204046.75217: Calling all_inventory to load vars for managed-node3 10215 1727204046.75220: Calling groups_inventory to load vars for managed-node3 10215 1727204046.75222: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204046.75232: Calling all_plugins_play to load vars for managed-node3 10215 1727204046.75235: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204046.75239: Calling groups_plugins_play to load vars for managed-node3 10215 1727204046.77433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.80560: done with get_vars() 10215 1727204046.80597: done getting variables 10215 1727204046.80672: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:06 -0400 (0:00:01.321) 0:00:15.374 ***** 10215 1727204046.80716: entering _queue_task() for managed-node3/debug 10215 1727204046.81296: worker is 1 (out of 1 available) 10215 1727204046.81306: exiting _queue_task() for managed-node3/debug 10215 1727204046.81318: done queuing things up, now waiting for results queue to drain 10215 1727204046.81320: waiting for pending results... 10215 1727204046.81422: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 10215 1727204046.81540: in run() - task 12b410aa-8751-3c74-8f8e-000000000027 10215 1727204046.81565: variable 'ansible_search_path' from source: unknown 10215 1727204046.81573: variable 'ansible_search_path' from source: unknown 10215 1727204046.81626: calling self._execute() 10215 1727204046.81734: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.81764: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.81767: variable 'omit' from source: magic vars 10215 1727204046.82223: variable 'ansible_distribution_major_version' from source: facts 10215 1727204046.82242: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204046.82278: variable 'omit' from source: magic vars 10215 1727204046.82342: variable 'omit' from source: magic vars 10215 1727204046.82475: variable 'network_provider' from source: set_fact 10215 1727204046.82506: variable 'omit' from source: magic vars 10215 1727204046.82610: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204046.82613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204046.82641: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204046.82664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204046.82680: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 
1727204046.82726: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204046.82737: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.82748: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.82896: Set connection var ansible_connection to ssh 10215 1727204046.82934: Set connection var ansible_pipelining to False 10215 1727204046.82938: Set connection var ansible_shell_type to sh 10215 1727204046.82940: Set connection var ansible_timeout to 10 10215 1727204046.82947: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204046.82968: Set connection var ansible_shell_executable to /bin/sh 10215 1727204046.83001: variable 'ansible_shell_executable' from source: unknown 10215 1727204046.83043: variable 'ansible_connection' from source: unknown 10215 1727204046.83046: variable 'ansible_module_compression' from source: unknown 10215 1727204046.83048: variable 'ansible_shell_type' from source: unknown 10215 1727204046.83051: variable 'ansible_shell_executable' from source: unknown 10215 1727204046.83053: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.83055: variable 'ansible_pipelining' from source: unknown 10215 1727204046.83057: variable 'ansible_timeout' from source: unknown 10215 1727204046.83152: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.83264: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204046.83288: variable 'omit' from source: magic vars 10215 1727204046.83303: starting attempt loop 10215 1727204046.83371: running the handler 10215 1727204046.83378: handler run complete 10215 1727204046.83412: attempt loop complete, returning result 10215 1727204046.83422: _execute() done 10215 1727204046.83431: dumping result to json 10215 1727204046.83439: done dumping result, returning 10215 1727204046.83452: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-3c74-8f8e-000000000027] 10215 1727204046.83464: sending task result for task 12b410aa-8751-3c74-8f8e-000000000027 ok: [managed-node3] => {} MSG: Using network provider: nm 10215 1727204046.83666: no more pending results, returning what we have 10215 1727204046.83671: results queue empty 10215 1727204046.83673: checking for any_errors_fatal 10215 1727204046.83685: done checking for any_errors_fatal 10215 1727204046.83687: checking for max_fail_percentage 10215 1727204046.83688: done checking for max_fail_percentage 10215 1727204046.83692: checking to see if all hosts have failed and the running result is not ok 10215 1727204046.83693: done checking to see if all hosts have failed 10215 1727204046.83694: getting the remaining hosts for this loop 10215 1727204046.83696: done getting the remaining hosts for this loop 10215 1727204046.83702: getting the next task for host managed-node3 10215 1727204046.83714: done getting next task for host managed-node3 10215 1727204046.83718: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10215 1727204046.83722: ^ state is: HOST STATE: block=2, 
task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204046.83736: getting variables 10215 1727204046.83738: in VariableManager get_vars() 10215 1727204046.83783: Calling all_inventory to load vars for managed-node3 10215 1727204046.83787: Calling groups_inventory to load vars for managed-node3 10215 1727204046.84010: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204046.84022: Calling all_plugins_play to load vars for managed-node3 10215 1727204046.84026: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204046.84030: Calling groups_plugins_play to load vars for managed-node3 10215 1727204046.84624: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000027 10215 1727204046.84628: WORKER PROCESS EXITING 10215 1727204046.86217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.87787: done with get_vars() 10215 1727204046.87822: done getting variables 10215 1727204046.87916: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.072) 0:00:15.446 ***** 10215 1727204046.87960: entering _queue_task() for managed-node3/fail 10215 1727204046.87962: Creating lock for fail 10215 1727204046.88294: worker is 1 (out of 1 available) 10215 1727204046.88312: exiting _queue_task() for managed-node3/fail 10215 1727204046.88327: done queuing things up, now waiting for results queue to drain 10215 1727204046.88330: waiting for pending results... 
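
The entries above trace the role's "Print network provider" task (roles/network/tasks/main.yml:7): the debug action resolves the network_provider fact (set earlier via set_fact) and reports "Using network provider: nm". A minimal sketch of what that task plausibly looks like, reconstructed only from the logged message and variable name; the exact wording in the shipped role may differ:

    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"
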
10215 1727204046.88622: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10215 1727204046.88782: in run() - task 12b410aa-8751-3c74-8f8e-000000000028 10215 1727204046.88810: variable 'ansible_search_path' from source: unknown 10215 1727204046.88821: variable 'ansible_search_path' from source: unknown 10215 1727204046.88872: calling self._execute() 10215 1727204046.88973: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.88988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.89010: variable 'omit' from source: magic vars 10215 1727204046.89355: variable 'ansible_distribution_major_version' from source: facts 10215 1727204046.89366: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204046.89470: variable 'network_state' from source: role '' defaults 10215 1727204046.89483: Evaluated conditional (network_state != {}): False 10215 1727204046.89488: when evaluation is False, skipping this task 10215 1727204046.89491: _execute() done 10215 1727204046.89494: dumping result to json 10215 1727204046.89496: done dumping result, returning 10215 1727204046.89507: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-3c74-8f8e-000000000028] 10215 1727204046.89513: sending task result for task 12b410aa-8751-3c74-8f8e-000000000028 10215 1727204046.89610: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000028 10215 1727204046.89613: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204046.89669: no more pending results, returning what we have 10215 1727204046.89673: results queue empty 10215 1727204046.89674: checking for any_errors_fatal 10215 1727204046.89680: done checking for any_errors_fatal 10215 1727204046.89681: checking for max_fail_percentage 10215 1727204046.89683: done checking for max_fail_percentage 10215 1727204046.89684: checking to see if all hosts have failed and the running result is not ok 10215 1727204046.89685: done checking to see if all hosts have failed 10215 1727204046.89685: getting the remaining hosts for this loop 10215 1727204046.89687: done getting the remaining hosts for this loop 10215 1727204046.89693: getting the next task for host managed-node3 10215 1727204046.89700: done getting next task for host managed-node3 10215 1727204046.89704: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10215 1727204046.89707: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204046.89724: getting variables 10215 1727204046.89726: in VariableManager get_vars() 10215 1727204046.89762: Calling all_inventory to load vars for managed-node3 10215 1727204046.89765: Calling groups_inventory to load vars for managed-node3 10215 1727204046.89768: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204046.89778: Calling all_plugins_play to load vars for managed-node3 10215 1727204046.89781: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204046.89784: Calling groups_plugins_play to load vars for managed-node3 10215 1727204046.91041: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.93314: done with get_vars() 10215 1727204046.93334: done getting variables 10215 1727204046.93384: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.054) 0:00:15.501 ***** 10215 1727204046.93413: entering _queue_task() for managed-node3/fail 10215 1727204046.93621: worker is 1 (out of 1 available) 10215 1727204046.93636: exiting _queue_task() for managed-node3/fail 10215 1727204046.93648: done queuing things up, now waiting for results queue to drain 10215 1727204046.93650: waiting for pending results... 
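
The initscripts abort task above (main.yml:11) is skipped because the only condition the engine needed to evaluate, network_state != {}, came back False. A hedged sketch of such a guard task; the failure message and the initscripts check are assumptions suggested by the task name, and only the network_state condition appears in this log:

    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider  # hypothetical wording
      when:
        - network_state != {}                 # the only condition evaluated in the trace above
        - network_provider == "initscripts"   # assumed second guard implied by the task name
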
10215 1727204046.93839: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10215 1727204046.93946: in run() - task 12b410aa-8751-3c74-8f8e-000000000029 10215 1727204046.93958: variable 'ansible_search_path' from source: unknown 10215 1727204046.93962: variable 'ansible_search_path' from source: unknown 10215 1727204046.93997: calling self._execute() 10215 1727204046.94066: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.94073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.94083: variable 'omit' from source: magic vars 10215 1727204046.94383: variable 'ansible_distribution_major_version' from source: facts 10215 1727204046.94395: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204046.94635: variable 'network_state' from source: role '' defaults 10215 1727204046.94639: Evaluated conditional (network_state != {}): False 10215 1727204046.94641: when evaluation is False, skipping this task 10215 1727204046.94643: _execute() done 10215 1727204046.94646: dumping result to json 10215 1727204046.94651: done dumping result, returning 10215 1727204046.94654: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-3c74-8f8e-000000000029] 10215 1727204046.94656: sending task result for task 12b410aa-8751-3c74-8f8e-000000000029 10215 1727204046.94728: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000029 10215 1727204046.94731: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204046.94782: no more pending results, returning what we have 10215 1727204046.94785: results queue empty 10215 1727204046.94786: checking for any_errors_fatal 10215 1727204046.94795: done checking for any_errors_fatal 10215 1727204046.94796: checking for max_fail_percentage 10215 1727204046.94798: done checking for max_fail_percentage 10215 1727204046.94798: checking to see if all hosts have failed and the running result is not ok 10215 1727204046.94799: done checking to see if all hosts have failed 10215 1727204046.94800: getting the remaining hosts for this loop 10215 1727204046.94802: done getting the remaining hosts for this loop 10215 1727204046.94805: getting the next task for host managed-node3 10215 1727204046.94811: done getting next task for host managed-node3 10215 1727204046.94814: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10215 1727204046.94818: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204046.94833: getting variables 10215 1727204046.94834: in VariableManager get_vars() 10215 1727204046.94883: Calling all_inventory to load vars for managed-node3 10215 1727204046.94886: Calling groups_inventory to load vars for managed-node3 10215 1727204046.94891: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204046.94901: Calling all_plugins_play to load vars for managed-node3 10215 1727204046.94904: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204046.94910: Calling groups_plugins_play to load vars for managed-node3 10215 1727204046.97130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204046.98712: done with get_vars() 10215 1727204046.98734: done getting variables 10215 1727204046.98785: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.054) 0:00:15.555 ***** 10215 1727204046.98820: entering _queue_task() for managed-node3/fail 10215 1727204046.99201: worker is 1 (out of 1 available) 10215 1727204046.99223: exiting _queue_task() for managed-node3/fail 10215 1727204046.99243: done queuing things up, now waiting for results queue to drain 10215 1727204046.99245: waiting for pending results... 
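
The version abort task above (main.yml:18) is skipped for the same reason, and the trace again reports only network_state != {} as the false condition: items in a when list are ANDed and evaluated in order, so once one is False the remaining guards (such as the version comparison implied by the task name) are never evaluated. An illustrative pattern of that short-circuit behaviour, not the role's actual code:

    - name: Example of an ordered when list that short-circuits
      ansible.builtin.fail:
        msg: unsupported combination   # hypothetical message
      when:
        - network_state != {}                              # evaluated first; False in this run, so...
        - ansible_distribution_major_version | int < 8     # ...this second guard is never reached
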
10215 1727204046.99710: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10215 1727204046.99715: in run() - task 12b410aa-8751-3c74-8f8e-00000000002a 10215 1727204046.99719: variable 'ansible_search_path' from source: unknown 10215 1727204046.99722: variable 'ansible_search_path' from source: unknown 10215 1727204046.99768: calling self._execute() 10215 1727204046.99870: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204046.99897: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204046.99923: variable 'omit' from source: magic vars 10215 1727204047.00276: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.00287: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.00447: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.02243: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.02299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.02335: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.02368: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.02394: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.02466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.02493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.02519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.02551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.02566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.02651: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.02666: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10215 1727204047.02768: variable 'ansible_distribution' from source: facts 10215 1727204047.02772: variable '__network_rh_distros' from source: role '' defaults 10215 1727204047.02783: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10215 1727204047.02787: when evaluation is False, skipping this task 10215 1727204047.02792: _execute() done 10215 1727204047.02797: dumping result to json 10215 1727204047.02800: done dumping result, returning 10215 1727204047.02810: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-3c74-8f8e-00000000002a] 10215 1727204047.02820: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002a 10215 1727204047.02915: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002a 10215 1727204047.02919: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10215 1727204047.02981: no more pending results, returning what we have 10215 1727204047.02986: results queue empty 10215 1727204047.02987: checking for any_errors_fatal 10215 1727204047.02995: done checking for any_errors_fatal 10215 1727204047.02996: checking for max_fail_percentage 10215 1727204047.02997: done checking for max_fail_percentage 10215 1727204047.02998: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.02999: done checking to see if all hosts have failed 10215 1727204047.03000: getting the remaining hosts for this loop 10215 1727204047.03002: done getting the remaining hosts for this loop 10215 1727204047.03007: getting the next task for host managed-node3 10215 1727204047.03015: done getting next task for host managed-node3 10215 1727204047.03019: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10215 1727204047.03022: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.03045: getting variables 10215 1727204047.03048: in VariableManager get_vars() 10215 1727204047.03092: Calling all_inventory to load vars for managed-node3 10215 1727204047.03095: Calling groups_inventory to load vars for managed-node3 10215 1727204047.03097: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.03108: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.03111: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.03114: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.04346: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.05893: done with get_vars() 10215 1727204047.05919: done getting variables 10215 1727204047.06009: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.072) 0:00:15.627 ***** 10215 1727204047.06037: entering _queue_task() for managed-node3/dnf 10215 1727204047.06298: worker is 1 (out of 1 available) 10215 1727204047.06313: exiting _queue_task() for managed-node3/dnf 10215 1727204047.06328: done queuing things up, now waiting for results queue to drain 10215 1727204047.06330: waiting for pending results... 
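
The teaming abort task above (main.yml:25) is skipped because, although the major version is above 9, the managed host's distribution is not in __network_rh_distros. A sketch built only from the two conditionals visible in the log; the failure message is hypothetical, and any additional guard on team connections is not reached in this run:

    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later  # hypothetical wording
      when:
        - ansible_distribution_major_version | int > 9   # evaluated True in the trace above
        - ansible_distribution in __network_rh_distros   # evaluated False here, so the task is skipped
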
10215 1727204047.06527: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10215 1727204047.06639: in run() - task 12b410aa-8751-3c74-8f8e-00000000002b 10215 1727204047.06652: variable 'ansible_search_path' from source: unknown 10215 1727204047.06655: variable 'ansible_search_path' from source: unknown 10215 1727204047.06695: calling self._execute() 10215 1727204047.06766: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.06772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.06787: variable 'omit' from source: magic vars 10215 1727204047.07092: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.07109: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.07275: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.09533: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.09538: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.09579: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.09637: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.09687: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.09809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.09862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.09915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.10077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.10081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.10177: variable 'ansible_distribution' from source: facts 10215 1727204047.10207: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.10224: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10215 1727204047.10408: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204047.10604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.10736: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.10742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.10759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.10783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.10849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.10898: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.10936: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.11064: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.11068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.11100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.11137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.11184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.11249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.11279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.11536: variable 'network_connections' from source: task vars 10215 1727204047.11595: variable 'controller_profile' from source: play vars 10215 1727204047.11660: variable 'controller_profile' from source: play vars 10215 1727204047.11678: variable 'controller_device' from source: play vars 10215 1727204047.11774: variable 'controller_device' from source: play vars 10215 1727204047.11795: variable 'port1_profile' from 
source: play vars 10215 1727204047.11891: variable 'port1_profile' from source: play vars 10215 1727204047.11936: variable 'dhcp_interface1' from source: play vars 10215 1727204047.12005: variable 'dhcp_interface1' from source: play vars 10215 1727204047.12021: variable 'controller_profile' from source: play vars 10215 1727204047.12111: variable 'controller_profile' from source: play vars 10215 1727204047.12155: variable 'port2_profile' from source: play vars 10215 1727204047.12218: variable 'port2_profile' from source: play vars 10215 1727204047.12234: variable 'dhcp_interface2' from source: play vars 10215 1727204047.12326: variable 'dhcp_interface2' from source: play vars 10215 1727204047.12397: variable 'controller_profile' from source: play vars 10215 1727204047.12432: variable 'controller_profile' from source: play vars 10215 1727204047.12551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204047.12837: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204047.12904: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204047.12995: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204047.13007: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204047.13082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204047.13120: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204047.13164: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.13395: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204047.13399: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204047.13656: variable 'network_connections' from source: task vars 10215 1727204047.13669: variable 'controller_profile' from source: play vars 10215 1727204047.13738: variable 'controller_profile' from source: play vars 10215 1727204047.13746: variable 'controller_device' from source: play vars 10215 1727204047.13797: variable 'controller_device' from source: play vars 10215 1727204047.13806: variable 'port1_profile' from source: play vars 10215 1727204047.13860: variable 'port1_profile' from source: play vars 10215 1727204047.13867: variable 'dhcp_interface1' from source: play vars 10215 1727204047.13920: variable 'dhcp_interface1' from source: play vars 10215 1727204047.13927: variable 'controller_profile' from source: play vars 10215 1727204047.13979: variable 'controller_profile' from source: play vars 10215 1727204047.13986: variable 'port2_profile' from source: play vars 10215 1727204047.14041: variable 'port2_profile' from source: play vars 10215 1727204047.14048: variable 'dhcp_interface2' from source: play vars 10215 1727204047.14101: variable 'dhcp_interface2' from source: play vars 
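
The variable resolution above shows network_connections being built from play vars: a controller profile/device plus two port profiles bound to dhcp_interface1 and dhcp_interface2, with controller_profile re-resolved after each port. A hypothetical reconstruction of that shape; only the variable names come from the log, while the connection types, state, and the bond choice are assumptions:

    network_connections:
      - name: "{{ controller_profile }}"
        interface_name: "{{ controller_device }}"
        type: bond                                # assumed; the connection type is not visible in this excerpt
        state: up                                 # assumed
      - name: "{{ port1_profile }}"
        interface_name: "{{ dhcp_interface1 }}"
        type: ethernet                            # assumed
        controller: "{{ controller_profile }}"    # matches controller_profile being resolved after each port
      - name: "{{ port2_profile }}"
        interface_name: "{{ dhcp_interface2 }}"
        type: ethernet                            # assumed
        controller: "{{ controller_profile }}"
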
10215 1727204047.14108: variable 'controller_profile' from source: play vars 10215 1727204047.14158: variable 'controller_profile' from source: play vars 10215 1727204047.14188: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204047.14194: when evaluation is False, skipping this task 10215 1727204047.14196: _execute() done 10215 1727204047.14201: dumping result to json 10215 1727204047.14205: done dumping result, returning 10215 1727204047.14216: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-00000000002b] 10215 1727204047.14222: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002b 10215 1727204047.14317: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002b 10215 1727204047.14320: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204047.14381: no more pending results, returning what we have 10215 1727204047.14385: results queue empty 10215 1727204047.14386: checking for any_errors_fatal 10215 1727204047.14394: done checking for any_errors_fatal 10215 1727204047.14395: checking for max_fail_percentage 10215 1727204047.14397: done checking for max_fail_percentage 10215 1727204047.14398: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.14399: done checking to see if all hosts have failed 10215 1727204047.14400: getting the remaining hosts for this loop 10215 1727204047.14401: done getting the remaining hosts for this loop 10215 1727204047.14407: getting the next task for host managed-node3 10215 1727204047.14414: done getting next task for host managed-node3 10215 1727204047.14418: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10215 1727204047.14421: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.14438: getting variables 10215 1727204047.14440: in VariableManager get_vars() 10215 1727204047.14483: Calling all_inventory to load vars for managed-node3 10215 1727204047.14486: Calling groups_inventory to load vars for managed-node3 10215 1727204047.14496: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.14507: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.14510: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.14514: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.15830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.17383: done with get_vars() 10215 1727204047.17407: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10215 1727204047.17471: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.114) 0:00:15.742 ***** 10215 1727204047.17498: entering _queue_task() for managed-node3/yum 10215 1727204047.17500: Creating lock for yum 10215 1727204047.17754: worker is 1 (out of 1 available) 10215 1727204047.17770: exiting _queue_task() for managed-node3/yum 10215 1727204047.17784: done queuing things up, now waiting for results queue to drain 10215 1727204047.17786: waiting for pending results... 
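
The DNF check task above (main.yml:36) evaluates two guards: the distribution/version test passes, but neither __network_wireless_connections_defined nor __network_team_connections_defined holds, so the update check is skipped. A hedged sketch using the two conditions taken from the log; the module arguments and check_mode are assumptions, since the excerpt never reaches an actual module call:

    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed; the excerpt does not show the module arguments
        state: latest                    # assumed
      check_mode: true                   # assumed from the "check if updates" wording
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined
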
10215 1727204047.17977: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10215 1727204047.18073: in run() - task 12b410aa-8751-3c74-8f8e-00000000002c 10215 1727204047.18087: variable 'ansible_search_path' from source: unknown 10215 1727204047.18092: variable 'ansible_search_path' from source: unknown 10215 1727204047.18131: calling self._execute() 10215 1727204047.18204: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.18213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.18224: variable 'omit' from source: magic vars 10215 1727204047.18533: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.18544: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.18696: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.20488: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.20545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.20579: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.20612: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.20636: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.20711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.20737: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.20761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.20799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.20814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.20895: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.20909: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10215 1727204047.20916: when evaluation is False, skipping this task 10215 1727204047.20919: _execute() done 10215 1727204047.20924: dumping result to json 10215 1727204047.20928: done dumping result, returning 10215 1727204047.20937: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-00000000002c] 10215 
1727204047.20943: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002c 10215 1727204047.21043: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002c 10215 1727204047.21046: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10215 1727204047.21132: no more pending results, returning what we have 10215 1727204047.21136: results queue empty 10215 1727204047.21138: checking for any_errors_fatal 10215 1727204047.21145: done checking for any_errors_fatal 10215 1727204047.21146: checking for max_fail_percentage 10215 1727204047.21148: done checking for max_fail_percentage 10215 1727204047.21149: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.21150: done checking to see if all hosts have failed 10215 1727204047.21151: getting the remaining hosts for this loop 10215 1727204047.21152: done getting the remaining hosts for this loop 10215 1727204047.21165: getting the next task for host managed-node3 10215 1727204047.21171: done getting next task for host managed-node3 10215 1727204047.21176: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10215 1727204047.21179: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.21198: getting variables 10215 1727204047.21199: in VariableManager get_vars() 10215 1727204047.21239: Calling all_inventory to load vars for managed-node3 10215 1727204047.21242: Calling groups_inventory to load vars for managed-node3 10215 1727204047.21245: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.21255: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.21258: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.21261: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.22977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.24748: done with get_vars() 10215 1727204047.24770: done getting variables 10215 1727204047.24827: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.073) 0:00:15.815 ***** 10215 1727204047.24855: entering _queue_task() for managed-node3/fail 10215 1727204047.25113: worker is 1 (out of 1 available) 10215 1727204047.25127: exiting _queue_task() for managed-node3/fail 10215 1727204047.25140: done queuing things up, now waiting for results queue to drain 10215 1727204047.25142: waiting for pending results... 
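
The YUM variant above (main.yml:48) is the companion task for hosts below EL8 and is skipped here because ansible_distribution_major_version | int < 8 is False; note the earlier "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" entry, meaning the controller routes the yum spelling to the dnf action plugin anyway. A sketch of that guard; everything other than the when condition and the redirect note is assumed:

    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:               # resolved to the dnf action plugin, per the redirect entry above
        name: "{{ network_packages }}"   # assumed
        state: latest                    # assumed
      check_mode: true                   # assumed
      when:
        - ansible_distribution_major_version | int < 8   # evaluated False in the trace above, so the task is skipped
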
10215 1727204047.25535: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10215 1727204047.25542: in run() - task 12b410aa-8751-3c74-8f8e-00000000002d 10215 1727204047.25644: variable 'ansible_search_path' from source: unknown 10215 1727204047.25654: variable 'ansible_search_path' from source: unknown 10215 1727204047.25702: calling self._execute() 10215 1727204047.25800: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.25995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.26000: variable 'omit' from source: magic vars 10215 1727204047.26224: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.26245: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.26392: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204047.26651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.29387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.29475: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.29534: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.29580: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.29626: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.29732: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.29772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.29818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.29878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.29901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.29975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.30012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.30056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.30113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.30139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.30201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.30238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.30281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.30339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.30369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.30615: variable 'network_connections' from source: task vars 10215 1727204047.30687: variable 'controller_profile' from source: play vars 10215 1727204047.30726: variable 'controller_profile' from source: play vars 10215 1727204047.30742: variable 'controller_device' from source: play vars 10215 1727204047.30830: variable 'controller_device' from source: play vars 10215 1727204047.30846: variable 'port1_profile' from source: play vars 10215 1727204047.30935: variable 'port1_profile' from source: play vars 10215 1727204047.30948: variable 'dhcp_interface1' from source: play vars 10215 1727204047.31038: variable 'dhcp_interface1' from source: play vars 10215 1727204047.31051: variable 'controller_profile' from source: play vars 10215 1727204047.31194: variable 'controller_profile' from source: play vars 10215 1727204047.31198: variable 'port2_profile' from source: play vars 10215 1727204047.31237: variable 'port2_profile' from source: play vars 10215 1727204047.31253: variable 'dhcp_interface2' from source: play vars 10215 1727204047.31343: variable 'dhcp_interface2' from source: play vars 10215 1727204047.31395: variable 'controller_profile' from source: play vars 10215 1727204047.31437: variable 'controller_profile' from source: play vars 10215 1727204047.31540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204047.31753: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204047.31899: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204047.31937: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204047.31972: Loading TestModule 'uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204047.32028: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204047.32056: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204047.32087: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.32196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204047.32212: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204047.32613: variable 'network_connections' from source: task vars 10215 1727204047.32616: variable 'controller_profile' from source: play vars 10215 1727204047.32702: variable 'controller_profile' from source: play vars 10215 1727204047.32712: variable 'controller_device' from source: play vars 10215 1727204047.32787: variable 'controller_device' from source: play vars 10215 1727204047.32799: variable 'port1_profile' from source: play vars 10215 1727204047.32882: variable 'port1_profile' from source: play vars 10215 1727204047.32885: variable 'dhcp_interface1' from source: play vars 10215 1727204047.32981: variable 'dhcp_interface1' from source: play vars 10215 1727204047.32987: variable 'controller_profile' from source: play vars 10215 1727204047.33037: variable 'controller_profile' from source: play vars 10215 1727204047.33046: variable 'port2_profile' from source: play vars 10215 1727204047.33194: variable 'port2_profile' from source: play vars 10215 1727204047.33198: variable 'dhcp_interface2' from source: play vars 10215 1727204047.33220: variable 'dhcp_interface2' from source: play vars 10215 1727204047.33228: variable 'controller_profile' from source: play vars 10215 1727204047.33298: variable 'controller_profile' from source: play vars 10215 1727204047.33360: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204047.33363: when evaluation is False, skipping this task 10215 1727204047.33366: _execute() done 10215 1727204047.33371: dumping result to json 10215 1727204047.33374: done dumping result, returning 10215 1727204047.33426: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-00000000002d] 10215 1727204047.33430: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002d 10215 1727204047.33499: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002d 10215 1727204047.33502: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204047.33585: no more pending results, returning what we have 10215 1727204047.33591: results queue empty 10215 1727204047.33592: checking for any_errors_fatal 10215 1727204047.33598: done checking for any_errors_fatal 10215 
1727204047.33599: checking for max_fail_percentage 10215 1727204047.33600: done checking for max_fail_percentage 10215 1727204047.33602: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.33603: done checking to see if all hosts have failed 10215 1727204047.33604: getting the remaining hosts for this loop 10215 1727204047.33605: done getting the remaining hosts for this loop 10215 1727204047.33612: getting the next task for host managed-node3 10215 1727204047.33618: done getting next task for host managed-node3 10215 1727204047.33623: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10215 1727204047.33626: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204047.33643: getting variables 10215 1727204047.33645: in VariableManager get_vars() 10215 1727204047.33685: Calling all_inventory to load vars for managed-node3 10215 1727204047.33688: Calling groups_inventory to load vars for managed-node3 10215 1727204047.33792: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.33811: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.33816: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.33820: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.36349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.39297: done with get_vars() 10215 1727204047.39331: done getting variables 10215 1727204047.39401: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.145) 0:00:15.961 ***** 10215 1727204047.39439: entering _queue_task() for managed-node3/package 10215 1727204047.39757: worker is 1 (out of 1 available) 10215 1727204047.39770: exiting _queue_task() for managed-node3/package 10215 1727204047.39783: done queuing things up, now waiting for results queue to drain 10215 1727204047.39785: waiting for pending results... 
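The skip recorded just above follows Ansible's standard conditional pattern: the task's when: expression is rendered against the resolved variables, and because __network_wireless_connections_defined or __network_team_connections_defined came out False, the module is never invoked and the host simply reports "skipping". Below is a minimal sketch of a task guarded the same way; the when: expression is copied verbatim from the trace, while the pause-style prompt body is only an illustrative assumption, not the actual source of the fedora.linux_system_roles.network role.

# Illustrative sketch only: the when: condition matches the trace above,
# the task body is an assumed stand-in, not the role's real implementation.
- name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
  ansible.builtin.pause:
    prompt: NetworkManager will be restarted to apply wireless or team profiles, press Enter to continue
  when: __network_wireless_connections_defined or __network_team_connections_defined
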
10215 1727204047.40217: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 10215 1727204047.40222: in run() - task 12b410aa-8751-3c74-8f8e-00000000002e 10215 1727204047.40228: variable 'ansible_search_path' from source: unknown 10215 1727204047.40237: variable 'ansible_search_path' from source: unknown 10215 1727204047.40309: calling self._execute() 10215 1727204047.40383: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.40399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.40422: variable 'omit' from source: magic vars 10215 1727204047.40854: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.40858: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.41121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204047.41520: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204047.41579: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204047.41695: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204047.41699: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204047.41825: variable 'network_packages' from source: role '' defaults 10215 1727204047.41970: variable '__network_provider_setup' from source: role '' defaults 10215 1727204047.42007: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204047.42093: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204047.42169: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204047.42230: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204047.42573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.50161: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.50279: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.50329: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.50378: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.50568: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.50619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.50657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.50695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.50749: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.50773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.50838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.50870: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.50908: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.50956: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.50979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.51284: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10215 1727204047.51449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.51558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.51561: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.51571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.51594: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.51712: variable 'ansible_python' from source: facts 10215 1727204047.51745: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10215 1727204047.51855: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204047.51965: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204047.52141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.52175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10215 1727204047.52217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.52271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.52294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.52361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.52423: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.52445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.52532: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.52535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.52709: variable 'network_connections' from source: task vars 10215 1727204047.52721: variable 'controller_profile' from source: play vars 10215 1727204047.52845: variable 'controller_profile' from source: play vars 10215 1727204047.52866: variable 'controller_device' from source: play vars 10215 1727204047.53076: variable 'controller_device' from source: play vars 10215 1727204047.53079: variable 'port1_profile' from source: play vars 10215 1727204047.53141: variable 'port1_profile' from source: play vars 10215 1727204047.53157: variable 'dhcp_interface1' from source: play vars 10215 1727204047.53294: variable 'dhcp_interface1' from source: play vars 10215 1727204047.53314: variable 'controller_profile' from source: play vars 10215 1727204047.53438: variable 'controller_profile' from source: play vars 10215 1727204047.53454: variable 'port2_profile' from source: play vars 10215 1727204047.53577: variable 'port2_profile' from source: play vars 10215 1727204047.53598: variable 'dhcp_interface2' from source: play vars 10215 1727204047.53741: variable 'dhcp_interface2' from source: play vars 10215 1727204047.53744: variable 'controller_profile' from source: play vars 10215 1727204047.53865: variable 'controller_profile' from source: play vars 10215 1727204047.53946: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204047.53999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 
1727204047.54066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.54079: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204047.54131: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204047.54531: variable 'network_connections' from source: task vars 10215 1727204047.54542: variable 'controller_profile' from source: play vars 10215 1727204047.54664: variable 'controller_profile' from source: play vars 10215 1727204047.54715: variable 'controller_device' from source: play vars 10215 1727204047.54807: variable 'controller_device' from source: play vars 10215 1727204047.54829: variable 'port1_profile' from source: play vars 10215 1727204047.54960: variable 'port1_profile' from source: play vars 10215 1727204047.54976: variable 'dhcp_interface1' from source: play vars 10215 1727204047.55099: variable 'dhcp_interface1' from source: play vars 10215 1727204047.55149: variable 'controller_profile' from source: play vars 10215 1727204047.55238: variable 'controller_profile' from source: play vars 10215 1727204047.55257: variable 'port2_profile' from source: play vars 10215 1727204047.55382: variable 'port2_profile' from source: play vars 10215 1727204047.55400: variable 'dhcp_interface2' from source: play vars 10215 1727204047.55700: variable 'dhcp_interface2' from source: play vars 10215 1727204047.55703: variable 'controller_profile' from source: play vars 10215 1727204047.55706: variable 'controller_profile' from source: play vars 10215 1727204047.55729: variable '__network_packages_default_wireless' from source: role '' defaults 10215 1727204047.55834: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204047.56247: variable 'network_connections' from source: task vars 10215 1727204047.56262: variable 'controller_profile' from source: play vars 10215 1727204047.56345: variable 'controller_profile' from source: play vars 10215 1727204047.56361: variable 'controller_device' from source: play vars 10215 1727204047.56448: variable 'controller_device' from source: play vars 10215 1727204047.56477: variable 'port1_profile' from source: play vars 10215 1727204047.56587: variable 'port1_profile' from source: play vars 10215 1727204047.56590: variable 'dhcp_interface1' from source: play vars 10215 1727204047.56650: variable 'dhcp_interface1' from source: play vars 10215 1727204047.56663: variable 'controller_profile' from source: play vars 10215 1727204047.56746: variable 'controller_profile' from source: play vars 10215 1727204047.56760: variable 'port2_profile' from source: play vars 10215 1727204047.56916: variable 'port2_profile' from source: play vars 10215 1727204047.56920: variable 'dhcp_interface2' from source: play vars 10215 1727204047.56934: variable 'dhcp_interface2' from source: play vars 10215 1727204047.56947: variable 'controller_profile' from source: play vars 10215 1727204047.57025: variable 'controller_profile' from source: play vars 10215 1727204047.57056: variable '__network_packages_default_team' from source: role '' defaults 10215 1727204047.57150: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204047.57557: variable 'network_connections' from source: 
task vars 10215 1727204047.57575: variable 'controller_profile' from source: play vars 10215 1727204047.57658: variable 'controller_profile' from source: play vars 10215 1727204047.57701: variable 'controller_device' from source: play vars 10215 1727204047.57822: variable 'controller_device' from source: play vars 10215 1727204047.57839: variable 'port1_profile' from source: play vars 10215 1727204047.57924: variable 'port1_profile' from source: play vars 10215 1727204047.57958: variable 'dhcp_interface1' from source: play vars 10215 1727204047.58045: variable 'dhcp_interface1' from source: play vars 10215 1727204047.58059: variable 'controller_profile' from source: play vars 10215 1727204047.58146: variable 'controller_profile' from source: play vars 10215 1727204047.58160: variable 'port2_profile' from source: play vars 10215 1727204047.58246: variable 'port2_profile' from source: play vars 10215 1727204047.58296: variable 'dhcp_interface2' from source: play vars 10215 1727204047.58350: variable 'dhcp_interface2' from source: play vars 10215 1727204047.58363: variable 'controller_profile' from source: play vars 10215 1727204047.58462: variable 'controller_profile' from source: play vars 10215 1727204047.58548: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204047.58759: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204047.58762: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204047.58839: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204047.59145: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10215 1727204047.59807: variable 'network_connections' from source: task vars 10215 1727204047.59819: variable 'controller_profile' from source: play vars 10215 1727204047.59903: variable 'controller_profile' from source: play vars 10215 1727204047.59922: variable 'controller_device' from source: play vars 10215 1727204047.60091: variable 'controller_device' from source: play vars 10215 1727204047.60094: variable 'port1_profile' from source: play vars 10215 1727204047.60101: variable 'port1_profile' from source: play vars 10215 1727204047.60107: variable 'dhcp_interface1' from source: play vars 10215 1727204047.60182: variable 'dhcp_interface1' from source: play vars 10215 1727204047.60200: variable 'controller_profile' from source: play vars 10215 1727204047.60273: variable 'controller_profile' from source: play vars 10215 1727204047.60287: variable 'port2_profile' from source: play vars 10215 1727204047.60368: variable 'port2_profile' from source: play vars 10215 1727204047.60380: variable 'dhcp_interface2' from source: play vars 10215 1727204047.60460: variable 'dhcp_interface2' from source: play vars 10215 1727204047.60473: variable 'controller_profile' from source: play vars 10215 1727204047.60554: variable 'controller_profile' from source: play vars 10215 1727204047.60568: variable 'ansible_distribution' from source: facts 10215 1727204047.60577: variable '__network_rh_distros' from source: role '' defaults 10215 1727204047.60587: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.60623: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10215 1727204047.60854: variable 'ansible_distribution' from source: facts 10215 1727204047.60863: variable '__network_rh_distros' from source: role '' defaults 
10215 1727204047.60956: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.60960: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10215 1727204047.61113: variable 'ansible_distribution' from source: facts 10215 1727204047.61124: variable '__network_rh_distros' from source: role '' defaults 10215 1727204047.61134: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.61184: variable 'network_provider' from source: set_fact 10215 1727204047.61212: variable 'ansible_facts' from source: unknown 10215 1727204047.62281: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10215 1727204047.62292: when evaluation is False, skipping this task 10215 1727204047.62300: _execute() done 10215 1727204047.62378: dumping result to json 10215 1727204047.62381: done dumping result, returning 10215 1727204047.62384: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-3c74-8f8e-00000000002e] 10215 1727204047.62386: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002e 10215 1727204047.62459: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002e 10215 1727204047.62462: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10215 1727204047.62534: no more pending results, returning what we have 10215 1727204047.62537: results queue empty 10215 1727204047.62538: checking for any_errors_fatal 10215 1727204047.62545: done checking for any_errors_fatal 10215 1727204047.62546: checking for max_fail_percentage 10215 1727204047.62548: done checking for max_fail_percentage 10215 1727204047.62550: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.62551: done checking to see if all hosts have failed 10215 1727204047.62552: getting the remaining hosts for this loop 10215 1727204047.62553: done getting the remaining hosts for this loop 10215 1727204047.62558: getting the next task for host managed-node3 10215 1727204047.62565: done getting next task for host managed-node3 10215 1727204047.62570: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10215 1727204047.62574: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.62591: getting variables 10215 1727204047.62593: in VariableManager get_vars() 10215 1727204047.62638: Calling all_inventory to load vars for managed-node3 10215 1727204047.62641: Calling groups_inventory to load vars for managed-node3 10215 1727204047.62644: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.62655: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.62659: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.62663: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.69335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.72420: done with get_vars() 10215 1727204047.72464: done getting variables 10215 1727204047.72522: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.331) 0:00:16.292 ***** 10215 1727204047.72559: entering _queue_task() for managed-node3/package 10215 1727204047.73044: worker is 1 (out of 1 available) 10215 1727204047.73064: exiting _queue_task() for managed-node3/package 10215 1727204047.73078: done queuing things up, now waiting for results queue to drain 10215 1727204047.73080: waiting for pending results... 
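In the Install packages trace above, the package action is skipped because everything in network_packages is already present in the gathered package facts: the condition not network_packages is subset(ansible_facts.packages.keys()) renders False, so there is nothing to install. A hedged sketch of that idempotency pattern follows; the when: expression is taken from the log, while the package_facts task and the example value of network_packages are assumptions added only to make the snippet self-contained.

# Illustrative sketch of the "install only if something is missing" pattern seen in the trace.
# The when: expression is verbatim from the log; the surrounding tasks and the example
# network_packages value are assumptions, not the role's real defaults.
- name: Gather the installed-package facts that the conditional relies on
  ansible.builtin.package_facts:

- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"
    state: present
  vars:
    network_packages:   # example value only
      - NetworkManager
  when: not network_packages is subset(ansible_facts.packages.keys())
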
10215 1727204047.73411: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10215 1727204047.73575: in run() - task 12b410aa-8751-3c74-8f8e-00000000002f 10215 1727204047.73593: variable 'ansible_search_path' from source: unknown 10215 1727204047.73597: variable 'ansible_search_path' from source: unknown 10215 1727204047.73696: calling self._execute() 10215 1727204047.73760: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.73772: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.73786: variable 'omit' from source: magic vars 10215 1727204047.74251: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.74264: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.74436: variable 'network_state' from source: role '' defaults 10215 1727204047.74481: Evaluated conditional (network_state != {}): False 10215 1727204047.74485: when evaluation is False, skipping this task 10215 1727204047.74488: _execute() done 10215 1727204047.74492: dumping result to json 10215 1727204047.74495: done dumping result, returning 10215 1727204047.74498: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-3c74-8f8e-00000000002f] 10215 1727204047.74501: sending task result for task 12b410aa-8751-3c74-8f8e-00000000002f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204047.74861: no more pending results, returning what we have 10215 1727204047.74865: results queue empty 10215 1727204047.74866: checking for any_errors_fatal 10215 1727204047.74872: done checking for any_errors_fatal 10215 1727204047.74873: checking for max_fail_percentage 10215 1727204047.74876: done checking for max_fail_percentage 10215 1727204047.74877: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.74878: done checking to see if all hosts have failed 10215 1727204047.74879: getting the remaining hosts for this loop 10215 1727204047.74881: done getting the remaining hosts for this loop 10215 1727204047.74885: getting the next task for host managed-node3 10215 1727204047.74893: done getting next task for host managed-node3 10215 1727204047.74898: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10215 1727204047.74901: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.74917: getting variables 10215 1727204047.74919: in VariableManager get_vars() 10215 1727204047.74964: Calling all_inventory to load vars for managed-node3 10215 1727204047.74968: Calling groups_inventory to load vars for managed-node3 10215 1727204047.74971: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.74983: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.74986: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.75048: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.75061: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000002f 10215 1727204047.75064: WORKER PROCESS EXITING 10215 1727204047.77530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.80532: done with get_vars() 10215 1727204047.80566: done getting variables 10215 1727204047.80647: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.081) 0:00:16.374 ***** 10215 1727204047.80687: entering _queue_task() for managed-node3/package 10215 1727204047.81039: worker is 1 (out of 1 available) 10215 1727204047.81054: exiting _queue_task() for managed-node3/package 10215 1727204047.81067: done queuing things up, now waiting for results queue to drain 10215 1727204047.81070: waiting for pending results... 
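Both nmstate-related installs in this block are gated on the same test: network_state is resolved from the role defaults here and compares equal to {}, so network_state != {} renders False and the package action is never queued. A sketch of that gate is below; the conditional and the package names are taken from the task title and condition shown in the trace, and the remaining details are assumptions rather than the role's actual task.

# Illustrative sketch: conditional and package names come from the trace above;
# everything else is assumed, not the role's actual task definition.
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
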
10215 1727204047.81378: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10215 1727204047.81598: in run() - task 12b410aa-8751-3c74-8f8e-000000000030 10215 1727204047.81603: variable 'ansible_search_path' from source: unknown 10215 1727204047.81605: variable 'ansible_search_path' from source: unknown 10215 1727204047.81643: calling self._execute() 10215 1727204047.81755: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.81771: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.81792: variable 'omit' from source: magic vars 10215 1727204047.82245: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.82266: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.82435: variable 'network_state' from source: role '' defaults 10215 1727204047.82454: Evaluated conditional (network_state != {}): False 10215 1727204047.82471: when evaluation is False, skipping this task 10215 1727204047.82479: _execute() done 10215 1727204047.82488: dumping result to json 10215 1727204047.82499: done dumping result, returning 10215 1727204047.82511: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-3c74-8f8e-000000000030] 10215 1727204047.82524: sending task result for task 12b410aa-8751-3c74-8f8e-000000000030 10215 1727204047.82797: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000030 10215 1727204047.82801: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204047.82850: no more pending results, returning what we have 10215 1727204047.82854: results queue empty 10215 1727204047.82856: checking for any_errors_fatal 10215 1727204047.82864: done checking for any_errors_fatal 10215 1727204047.82865: checking for max_fail_percentage 10215 1727204047.82867: done checking for max_fail_percentage 10215 1727204047.82868: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.82869: done checking to see if all hosts have failed 10215 1727204047.82870: getting the remaining hosts for this loop 10215 1727204047.82872: done getting the remaining hosts for this loop 10215 1727204047.82877: getting the next task for host managed-node3 10215 1727204047.82884: done getting next task for host managed-node3 10215 1727204047.82887: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10215 1727204047.82892: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204047.82909: getting variables 10215 1727204047.82911: in VariableManager get_vars() 10215 1727204047.82955: Calling all_inventory to load vars for managed-node3 10215 1727204047.82958: Calling groups_inventory to load vars for managed-node3 10215 1727204047.82961: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.82975: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.82978: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.82982: Calling groups_plugins_play to load vars for managed-node3 10215 1727204047.85167: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204047.88830: done with get_vars() 10215 1727204047.88867: done getting variables 10215 1727204047.88982: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.083) 0:00:16.457 ***** 10215 1727204047.89023: entering _queue_task() for managed-node3/service 10215 1727204047.89025: Creating lock for service 10215 1727204047.89354: worker is 1 (out of 1 available) 10215 1727204047.89368: exiting _queue_task() for managed-node3/service 10215 1727204047.89381: done queuing things up, now waiting for results queue to drain 10215 1727204047.89383: waiting for pending results... 
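The restart task queued above is the first use of the service action plugin in this run (hence the "Creating lock for service" line), but it is guarded by the same wireless/team test that already skipped the consent prompt, so it is about to be skipped as well. A sketch of a conditional restart of that shape follows; the module and the when: expression match the trace, while the service name and the restarted state are assumptions.

# Illustrative sketch: module and when: expression match the trace;
# the name/state values are assumed.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
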
10215 1727204047.89670: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10215 1727204047.89995: in run() - task 12b410aa-8751-3c74-8f8e-000000000031 10215 1727204047.90000: variable 'ansible_search_path' from source: unknown 10215 1727204047.90003: variable 'ansible_search_path' from source: unknown 10215 1727204047.90006: calling self._execute() 10215 1727204047.90009: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204047.90021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204047.90040: variable 'omit' from source: magic vars 10215 1727204047.90476: variable 'ansible_distribution_major_version' from source: facts 10215 1727204047.90498: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204047.90655: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204047.90927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204047.93941: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204047.94030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204047.94093: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204047.94145: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204047.94181: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204047.94281: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.94326: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.94370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.94429: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.94457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.94521: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.94573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.94582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10215 1727204047.94633: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.94652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.94790: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204047.94793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204047.94798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.94833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204047.94855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204047.95083: variable 'network_connections' from source: task vars 10215 1727204047.95107: variable 'controller_profile' from source: play vars 10215 1727204047.95200: variable 'controller_profile' from source: play vars 10215 1727204047.95217: variable 'controller_device' from source: play vars 10215 1727204047.95301: variable 'controller_device' from source: play vars 10215 1727204047.95320: variable 'port1_profile' from source: play vars 10215 1727204047.95403: variable 'port1_profile' from source: play vars 10215 1727204047.95459: variable 'dhcp_interface1' from source: play vars 10215 1727204047.95501: variable 'dhcp_interface1' from source: play vars 10215 1727204047.95515: variable 'controller_profile' from source: play vars 10215 1727204047.95599: variable 'controller_profile' from source: play vars 10215 1727204047.95614: variable 'port2_profile' from source: play vars 10215 1727204047.95697: variable 'port2_profile' from source: play vars 10215 1727204047.95783: variable 'dhcp_interface2' from source: play vars 10215 1727204047.95792: variable 'dhcp_interface2' from source: play vars 10215 1727204047.95806: variable 'controller_profile' from source: play vars 10215 1727204047.95882: variable 'controller_profile' from source: play vars 10215 1727204047.95978: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204047.96191: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204047.96256: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204047.96301: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204047.96346: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204047.96441: Loading 
TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204047.96445: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204047.96474: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204047.96514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204047.96605: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204047.96986: variable 'network_connections' from source: task vars 10215 1727204047.96992: variable 'controller_profile' from source: play vars 10215 1727204047.97033: variable 'controller_profile' from source: play vars 10215 1727204047.97047: variable 'controller_device' from source: play vars 10215 1727204047.97126: variable 'controller_device' from source: play vars 10215 1727204047.97144: variable 'port1_profile' from source: play vars 10215 1727204047.97225: variable 'port1_profile' from source: play vars 10215 1727204047.97238: variable 'dhcp_interface1' from source: play vars 10215 1727204047.97316: variable 'dhcp_interface1' from source: play vars 10215 1727204047.97396: variable 'controller_profile' from source: play vars 10215 1727204047.97408: variable 'controller_profile' from source: play vars 10215 1727204047.97425: variable 'port2_profile' from source: play vars 10215 1727204047.97498: variable 'port2_profile' from source: play vars 10215 1727204047.97512: variable 'dhcp_interface2' from source: play vars 10215 1727204047.97594: variable 'dhcp_interface2' from source: play vars 10215 1727204047.97605: variable 'controller_profile' from source: play vars 10215 1727204047.97686: variable 'controller_profile' from source: play vars 10215 1727204047.97733: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204047.97742: when evaluation is False, skipping this task 10215 1727204047.97755: _execute() done 10215 1727204047.97895: dumping result to json 10215 1727204047.97898: done dumping result, returning 10215 1727204047.97901: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-000000000031] 10215 1727204047.97903: sending task result for task 12b410aa-8751-3c74-8f8e-000000000031 10215 1727204047.97977: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000031 10215 1727204047.97980: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204047.98034: no more pending results, returning what we have 10215 1727204047.98039: results queue empty 10215 1727204047.98040: checking for any_errors_fatal 10215 1727204047.98048: done checking for any_errors_fatal 10215 1727204047.98049: checking for max_fail_percentage 10215 1727204047.98051: done checking for max_fail_percentage 10215 
1727204047.98052: checking to see if all hosts have failed and the running result is not ok 10215 1727204047.98053: done checking to see if all hosts have failed 10215 1727204047.98054: getting the remaining hosts for this loop 10215 1727204047.98055: done getting the remaining hosts for this loop 10215 1727204047.98060: getting the next task for host managed-node3 10215 1727204047.98067: done getting next task for host managed-node3 10215 1727204047.98071: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10215 1727204047.98075: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204047.98091: getting variables 10215 1727204047.98093: in VariableManager get_vars() 10215 1727204047.98142: Calling all_inventory to load vars for managed-node3 10215 1727204047.98145: Calling groups_inventory to load vars for managed-node3 10215 1727204047.98149: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204047.98161: Calling all_plugins_play to load vars for managed-node3 10215 1727204047.98165: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204047.98169: Calling groups_plugins_play to load vars for managed-node3 10215 1727204048.00732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204048.03682: done with get_vars() 10215 1727204048.03724: done getting variables 10215 1727204048.03795: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.148) 0:00:16.605 ***** 10215 1727204048.03831: entering _queue_task() for managed-node3/service 10215 1727204048.04174: worker is 1 (out of 1 available) 10215 1727204048.04192: exiting _queue_task() for managed-node3/service 10215 1727204048.04207: done queuing things up, now waiting for results queue to drain 10215 1727204048.04209: waiting for pending results... 
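Unlike the previous service task, the Enable and start NetworkManager step that runs next passes its gate: network_provider is resolved from an earlier set_fact, network_state is still empty, and the trace below evaluates network_provider == "nm" or network_state != {} to True, so the service action is actually executed with the unit name taken from network_service_name. A sketch of that shape follows; the conditional and the network_service_name variable appear in the trace, while the started/enabled settings are assumptions about typical usage.

# Illustrative sketch: conditional and variable name come from the trace;
# the state/enabled settings are assumed, not the role's actual task.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
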
10215 1727204048.04612: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10215 1727204048.04711: in run() - task 12b410aa-8751-3c74-8f8e-000000000032 10215 1727204048.04735: variable 'ansible_search_path' from source: unknown 10215 1727204048.04747: variable 'ansible_search_path' from source: unknown 10215 1727204048.04797: calling self._execute() 10215 1727204048.04910: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204048.04931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204048.04948: variable 'omit' from source: magic vars 10215 1727204048.05412: variable 'ansible_distribution_major_version' from source: facts 10215 1727204048.05433: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204048.05657: variable 'network_provider' from source: set_fact 10215 1727204048.05668: variable 'network_state' from source: role '' defaults 10215 1727204048.05690: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10215 1727204048.05797: variable 'omit' from source: magic vars 10215 1727204048.05800: variable 'omit' from source: magic vars 10215 1727204048.05826: variable 'network_service_name' from source: role '' defaults 10215 1727204048.05918: variable 'network_service_name' from source: role '' defaults 10215 1727204048.06039: variable '__network_provider_setup' from source: role '' defaults 10215 1727204048.06051: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204048.06132: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204048.06146: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204048.06227: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204048.06522: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204048.09096: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204048.09185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204048.09277: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204048.09286: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204048.09325: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204048.09427: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204048.09467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204048.09511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.09567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 10215 1727204048.09606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204048.09656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204048.09691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204048.09794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.09797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204048.09800: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204048.10116: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10215 1727204048.10282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204048.10318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204048.10352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.10413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204048.10434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204048.10553: variable 'ansible_python' from source: facts 10215 1727204048.10586: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10215 1727204048.10694: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204048.10801: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204048.10971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204048.11008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204048.11043: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.11103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204048.11189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204048.11192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204048.11236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204048.11270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.11329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204048.11350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204048.11536: variable 'network_connections' from source: task vars 10215 1727204048.11553: variable 'controller_profile' from source: play vars 10215 1727204048.11648: variable 'controller_profile' from source: play vars 10215 1727204048.11667: variable 'controller_device' from source: play vars 10215 1727204048.11764: variable 'controller_device' from source: play vars 10215 1727204048.11897: variable 'port1_profile' from source: play vars 10215 1727204048.11900: variable 'port1_profile' from source: play vars 10215 1727204048.11902: variable 'dhcp_interface1' from source: play vars 10215 1727204048.11985: variable 'dhcp_interface1' from source: play vars 10215 1727204048.12007: variable 'controller_profile' from source: play vars 10215 1727204048.12103: variable 'controller_profile' from source: play vars 10215 1727204048.12127: variable 'port2_profile' from source: play vars 10215 1727204048.12218: variable 'port2_profile' from source: play vars 10215 1727204048.12242: variable 'dhcp_interface2' from source: play vars 10215 1727204048.12334: variable 'dhcp_interface2' from source: play vars 10215 1727204048.12358: variable 'controller_profile' from source: play vars 10215 1727204048.12455: variable 'controller_profile' from source: play vars 10215 1727204048.12594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204048.12891: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204048.12919: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204048.12975: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 
1727204048.13034: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204048.13117: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204048.13158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204048.13205: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204048.13255: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204048.13394: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204048.13715: variable 'network_connections' from source: task vars 10215 1727204048.13727: variable 'controller_profile' from source: play vars 10215 1727204048.13824: variable 'controller_profile' from source: play vars 10215 1727204048.13840: variable 'controller_device' from source: play vars 10215 1727204048.14147: variable 'controller_device' from source: play vars 10215 1727204048.14151: variable 'port1_profile' from source: play vars 10215 1727204048.14221: variable 'port1_profile' from source: play vars 10215 1727204048.14238: variable 'dhcp_interface1' from source: play vars 10215 1727204048.14337: variable 'dhcp_interface1' from source: play vars 10215 1727204048.14355: variable 'controller_profile' from source: play vars 10215 1727204048.14450: variable 'controller_profile' from source: play vars 10215 1727204048.14467: variable 'port2_profile' from source: play vars 10215 1727204048.14562: variable 'port2_profile' from source: play vars 10215 1727204048.14579: variable 'dhcp_interface2' from source: play vars 10215 1727204048.14683: variable 'dhcp_interface2' from source: play vars 10215 1727204048.14712: variable 'controller_profile' from source: play vars 10215 1727204048.14807: variable 'controller_profile' from source: play vars 10215 1727204048.14877: variable '__network_packages_default_wireless' from source: role '' defaults 10215 1727204048.14987: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204048.15406: variable 'network_connections' from source: task vars 10215 1727204048.15469: variable 'controller_profile' from source: play vars 10215 1727204048.15519: variable 'controller_profile' from source: play vars 10215 1727204048.15532: variable 'controller_device' from source: play vars 10215 1727204048.15623: variable 'controller_device' from source: play vars 10215 1727204048.15641: variable 'port1_profile' from source: play vars 10215 1727204048.15737: variable 'port1_profile' from source: play vars 10215 1727204048.15750: variable 'dhcp_interface1' from source: play vars 10215 1727204048.15841: variable 'dhcp_interface1' from source: play vars 10215 1727204048.15856: variable 'controller_profile' from source: play vars 10215 1727204048.16096: variable 'controller_profile' from source: play vars 10215 1727204048.16099: variable 'port2_profile' from source: play vars 10215 1727204048.16102: variable 'port2_profile' from source: play vars 10215 
1727204048.16104: variable 'dhcp_interface2' from source: play vars 10215 1727204048.16154: variable 'dhcp_interface2' from source: play vars 10215 1727204048.16169: variable 'controller_profile' from source: play vars 10215 1727204048.16266: variable 'controller_profile' from source: play vars 10215 1727204048.16316: variable '__network_packages_default_team' from source: role '' defaults 10215 1727204048.16424: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204048.16862: variable 'network_connections' from source: task vars 10215 1727204048.16873: variable 'controller_profile' from source: play vars 10215 1727204048.17001: variable 'controller_profile' from source: play vars 10215 1727204048.17021: variable 'controller_device' from source: play vars 10215 1727204048.17278: variable 'controller_device' from source: play vars 10215 1727204048.17297: variable 'port1_profile' from source: play vars 10215 1727204048.17387: variable 'port1_profile' from source: play vars 10215 1727204048.17405: variable 'dhcp_interface1' from source: play vars 10215 1727204048.17497: variable 'dhcp_interface1' from source: play vars 10215 1727204048.17511: variable 'controller_profile' from source: play vars 10215 1727204048.17607: variable 'controller_profile' from source: play vars 10215 1727204048.17623: variable 'port2_profile' from source: play vars 10215 1727204048.17714: variable 'port2_profile' from source: play vars 10215 1727204048.17797: variable 'dhcp_interface2' from source: play vars 10215 1727204048.17824: variable 'dhcp_interface2' from source: play vars 10215 1727204048.17837: variable 'controller_profile' from source: play vars 10215 1727204048.17929: variable 'controller_profile' from source: play vars 10215 1727204048.18020: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204048.18109: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204048.18122: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204048.18205: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204048.18577: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10215 1727204048.19210: variable 'network_connections' from source: task vars 10215 1727204048.19224: variable 'controller_profile' from source: play vars 10215 1727204048.19314: variable 'controller_profile' from source: play vars 10215 1727204048.19327: variable 'controller_device' from source: play vars 10215 1727204048.19407: variable 'controller_device' from source: play vars 10215 1727204048.19426: variable 'port1_profile' from source: play vars 10215 1727204048.19511: variable 'port1_profile' from source: play vars 10215 1727204048.19524: variable 'dhcp_interface1' from source: play vars 10215 1727204048.19697: variable 'dhcp_interface1' from source: play vars 10215 1727204048.19700: variable 'controller_profile' from source: play vars 10215 1727204048.19703: variable 'controller_profile' from source: play vars 10215 1727204048.19705: variable 'port2_profile' from source: play vars 10215 1727204048.19777: variable 'port2_profile' from source: play vars 10215 1727204048.19825: variable 'dhcp_interface2' from source: play vars 10215 1727204048.19907: variable 'dhcp_interface2' from source: play vars 10215 1727204048.19921: variable 'controller_profile' from source: play vars 10215 1727204048.20121: variable 
'controller_profile' from source: play vars 10215 1727204048.20138: variable 'ansible_distribution' from source: facts 10215 1727204048.20148: variable '__network_rh_distros' from source: role '' defaults 10215 1727204048.20160: variable 'ansible_distribution_major_version' from source: facts 10215 1727204048.20230: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10215 1727204048.20724: variable 'ansible_distribution' from source: facts 10215 1727204048.20794: variable '__network_rh_distros' from source: role '' defaults 10215 1727204048.20798: variable 'ansible_distribution_major_version' from source: facts 10215 1727204048.20800: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10215 1727204048.21385: variable 'ansible_distribution' from source: facts 10215 1727204048.21388: variable '__network_rh_distros' from source: role '' defaults 10215 1727204048.21393: variable 'ansible_distribution_major_version' from source: facts 10215 1727204048.21395: variable 'network_provider' from source: set_fact 10215 1727204048.21398: variable 'omit' from source: magic vars 10215 1727204048.21517: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204048.21556: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204048.21630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204048.21675: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204048.21731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204048.22044: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204048.22047: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204048.22049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204048.22286: Set connection var ansible_connection to ssh 10215 1727204048.22304: Set connection var ansible_pipelining to False 10215 1727204048.22317: Set connection var ansible_shell_type to sh 10215 1727204048.22329: Set connection var ansible_timeout to 10 10215 1727204048.22341: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204048.22385: Set connection var ansible_shell_executable to /bin/sh 10215 1727204048.22512: variable 'ansible_shell_executable' from source: unknown 10215 1727204048.22523: variable 'ansible_connection' from source: unknown 10215 1727204048.22530: variable 'ansible_module_compression' from source: unknown 10215 1727204048.22538: variable 'ansible_shell_type' from source: unknown 10215 1727204048.22546: variable 'ansible_shell_executable' from source: unknown 10215 1727204048.22555: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204048.22564: variable 'ansible_pipelining' from source: unknown 10215 1727204048.22572: variable 'ansible_timeout' from source: unknown 10215 1727204048.22582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204048.22827: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204048.22850: variable 'omit' from source: magic vars 10215 1727204048.23096: starting attempt loop 10215 1727204048.23099: running the handler 10215 1727204048.23101: variable 'ansible_facts' from source: unknown 10215 1727204048.24734: _low_level_execute_command(): starting 10215 1727204048.24755: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204048.25463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204048.25482: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204048.25501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.25527: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204048.25609: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.25650: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204048.25669: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204048.25695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204048.25765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204048.27522: stdout chunk (state=3): >>>/root <<< 10215 1727204048.27793: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204048.27896: stdout chunk (state=3): >>><<< 10215 1727204048.27900: stderr chunk (state=3): >>><<< 10215 1727204048.27904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204048.27910: _low_level_execute_command(): starting 10215 1727204048.27914: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494 `" && echo ansible-tmp-1727204048.2783945-11323-253608633595494="` echo /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494 `" ) && sleep 0' 10215 1727204048.29236: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204048.29250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.29306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.29412: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204048.29425: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204048.29488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204048.29760: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204048.31752: stdout chunk (state=3): >>>ansible-tmp-1727204048.2783945-11323-253608633595494=/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494 <<< 10215 1727204048.31863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204048.32121: stderr chunk (state=3): >>><<< 10215 1727204048.32124: stdout chunk (state=3): >>><<< 10215 1727204048.32143: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.2783945-11323-253608633595494=/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204048.32186: variable 'ansible_module_compression' from source: unknown 10215 1727204048.32255: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 10215 1727204048.32319: ANSIBALLZ: Acquiring lock 10215 1727204048.32323: ANSIBALLZ: Lock acquired: 139878728192448 10215 1727204048.32331: ANSIBALLZ: Creating module 10215 1727204048.63869: ANSIBALLZ: Writing module into payload 10215 1727204048.64016: ANSIBALLZ: Writing module 10215 1727204048.64041: ANSIBALLZ: Renaming module 10215 1727204048.64046: ANSIBALLZ: Done creating module 10215 1727204048.64080: variable 'ansible_facts' from source: unknown 10215 1727204048.64222: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py 10215 1727204048.64350: Sending initial data 10215 1727204048.64354: Sent initial data (156 bytes) 10215 1727204048.64846: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.64849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.64852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204048.64855: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.64902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204048.64905: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204048.64961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204048.66680: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10215 1727204048.66684: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204048.66713: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204048.66753: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpiisa8qkq /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py <<< 10215 1727204048.66757: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py" <<< 10215 1727204048.66787: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpiisa8qkq" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py" <<< 10215 1727204048.68477: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204048.68549: stderr chunk (state=3): >>><<< 10215 1727204048.68553: stdout chunk (state=3): >>><<< 10215 1727204048.68572: done transferring module to remote 10215 1727204048.68583: _low_level_execute_command(): starting 10215 1727204048.68592: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/ /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py && sleep 0' 10215 1727204048.69071: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.69076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204048.69078: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.69081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.69083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.69142: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204048.69145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204048.69185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204048.71035: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204048.71078: stderr chunk (state=3): >>><<< 10215 1727204048.71081: stdout chunk (state=3): >>><<< 10215 1727204048.71098: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204048.71101: _low_level_execute_command(): starting 10215 1727204048.71107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/AnsiballZ_systemd.py && sleep 0' 10215 1727204048.71561: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.71603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204048.71606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.71611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204048.71614: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204048.71658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204048.71665: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204048.71711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204049.04276: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", 
"CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11698176", "MemoryAvailable": "infinity", "CPUUsageNSec": "637856000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 10215 1727204049.04295: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": 
"infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice 
cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "lo<<< 10215 1727204049.04331: stdout chunk (state=3): >>>aded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 10215 1727204049.06342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204049.06411: stderr chunk (state=3): >>><<< 10215 1727204049.06415: stdout chunk (state=3): >>><<< 10215 1727204049.06431: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11698176", "MemoryAvailable": "infinity", "CPUUsageNSec": "637856000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204049.06605: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204049.06627: _low_level_execute_command(): starting 10215 1727204049.06631: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.2783945-11323-253608633595494/ > /dev/null 2>&1 && sleep 0' 10215 1727204049.07099: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204049.07105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204049.07135: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204049.07140: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204049.07199: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204049.07206: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204049.07208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204049.07244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204049.09131: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 <<< 10215 1727204049.09179: stderr chunk (state=3): >>><<< 10215 1727204049.09182: stdout chunk (state=3): >>><<< 10215 1727204049.09201: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204049.09208: handler run complete 10215 1727204049.09257: attempt loop complete, returning result 10215 1727204049.09260: _execute() done 10215 1727204049.09263: dumping result to json 10215 1727204049.09284: done dumping result, returning 10215 1727204049.09292: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-3c74-8f8e-000000000032] 10215 1727204049.09299: sending task result for task 12b410aa-8751-3c74-8f8e-000000000032 10215 1727204049.09573: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000032 10215 1727204049.09576: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204049.09636: no more pending results, returning what we have 10215 1727204049.09640: results queue empty 10215 1727204049.09641: checking for any_errors_fatal 10215 1727204049.09648: done checking for any_errors_fatal 10215 1727204049.09648: checking for max_fail_percentage 10215 1727204049.09650: done checking for max_fail_percentage 10215 1727204049.09651: checking to see if all hosts have failed and the running result is not ok 10215 1727204049.09652: done checking to see if all hosts have failed 10215 1727204049.09653: getting the remaining hosts for this loop 10215 1727204049.09655: done getting the remaining hosts for this loop 10215 1727204049.09659: getting the next task for host managed-node3 10215 1727204049.09665: done getting next task for host managed-node3 10215 1727204049.09669: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10215 1727204049.09672: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204049.09683: getting variables 10215 1727204049.09685: in VariableManager get_vars() 10215 1727204049.09736: Calling all_inventory to load vars for managed-node3 10215 1727204049.09739: Calling groups_inventory to load vars for managed-node3 10215 1727204049.09741: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204049.09753: Calling all_plugins_play to load vars for managed-node3 10215 1727204049.09756: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204049.09759: Calling groups_plugins_play to load vars for managed-node3 10215 1727204049.11068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204049.12614: done with get_vars() 10215 1727204049.12636: done getting variables 10215 1727204049.12686: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:09 -0400 (0:00:01.088) 0:00:17.694 ***** 10215 1727204049.12715: entering _queue_task() for managed-node3/service 10215 1727204049.12969: worker is 1 (out of 1 available) 10215 1727204049.12983: exiting _queue_task() for managed-node3/service 10215 1727204049.12999: done queuing things up, now waiting for results queue to drain 10215 1727204049.13001: waiting for pending results... 
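The task that completed just above ("Enable and start NetworkManager") was recorded with module_args {'name': 'NetworkManager', 'state': 'started', 'enabled': True} and '_ansible_no_log': True, which is why the result is printed as "censored". As a rough sketch only, and not the actual source of fedora.linux_system_roles.network, an equivalent standalone task could look like the following; the generic 'service' action plugin shown being loaded in the log is what dispatched to ansible.legacy.systemd on this host, and the parameter values are taken from the module_args printed above:

    # Illustrative sketch of a task matching the recorded invocation.
    # Not copied from the role; task name and placement are assumptions.
    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true   # matches '_ansible_no_log': True; the result is censored in output

With no_log set, the callback prints the censored placeholder seen above instead of the full systemd property dump that the module returned.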
10215 1727204049.13194: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10215 1727204049.13296: in run() - task 12b410aa-8751-3c74-8f8e-000000000033 10215 1727204049.13310: variable 'ansible_search_path' from source: unknown 10215 1727204049.13315: variable 'ansible_search_path' from source: unknown 10215 1727204049.13352: calling self._execute() 10215 1727204049.13427: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.13434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.13446: variable 'omit' from source: magic vars 10215 1727204049.13776: variable 'ansible_distribution_major_version' from source: facts 10215 1727204049.13788: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204049.13885: variable 'network_provider' from source: set_fact 10215 1727204049.13892: Evaluated conditional (network_provider == "nm"): True 10215 1727204049.13970: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204049.14047: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204049.14194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204049.15848: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204049.15905: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204049.15936: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204049.16059: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204049.16063: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204049.16071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204049.16102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204049.16125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204049.16157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204049.16171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204049.16216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204049.16236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10215 1727204049.16256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204049.16288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204049.16305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204049.16341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204049.16361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204049.16382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204049.16417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204049.16430: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204049.16548: variable 'network_connections' from source: task vars 10215 1727204049.16559: variable 'controller_profile' from source: play vars 10215 1727204049.16614: variable 'controller_profile' from source: play vars 10215 1727204049.16627: variable 'controller_device' from source: play vars 10215 1727204049.16674: variable 'controller_device' from source: play vars 10215 1727204049.16683: variable 'port1_profile' from source: play vars 10215 1727204049.16738: variable 'port1_profile' from source: play vars 10215 1727204049.16746: variable 'dhcp_interface1' from source: play vars 10215 1727204049.16796: variable 'dhcp_interface1' from source: play vars 10215 1727204049.16803: variable 'controller_profile' from source: play vars 10215 1727204049.16855: variable 'controller_profile' from source: play vars 10215 1727204049.16863: variable 'port2_profile' from source: play vars 10215 1727204049.16915: variable 'port2_profile' from source: play vars 10215 1727204049.16923: variable 'dhcp_interface2' from source: play vars 10215 1727204049.16975: variable 'dhcp_interface2' from source: play vars 10215 1727204049.16981: variable 'controller_profile' from source: play vars 10215 1727204049.17033: variable 'controller_profile' from source: play vars 10215 1727204049.17094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204049.17225: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204049.17257: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204049.17285: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204049.17313: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204049.17348: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204049.17369: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204049.17393: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204049.17421: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204049.17465: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204049.17676: variable 'network_connections' from source: task vars 10215 1727204049.17681: variable 'controller_profile' from source: play vars 10215 1727204049.17737: variable 'controller_profile' from source: play vars 10215 1727204049.17744: variable 'controller_device' from source: play vars 10215 1727204049.17796: variable 'controller_device' from source: play vars 10215 1727204049.17805: variable 'port1_profile' from source: play vars 10215 1727204049.17858: variable 'port1_profile' from source: play vars 10215 1727204049.17865: variable 'dhcp_interface1' from source: play vars 10215 1727204049.17946: variable 'dhcp_interface1' from source: play vars 10215 1727204049.17949: variable 'controller_profile' from source: play vars 10215 1727204049.17992: variable 'controller_profile' from source: play vars 10215 1727204049.18002: variable 'port2_profile' from source: play vars 10215 1727204049.18052: variable 'port2_profile' from source: play vars 10215 1727204049.18064: variable 'dhcp_interface2' from source: play vars 10215 1727204049.18116: variable 'dhcp_interface2' from source: play vars 10215 1727204049.18123: variable 'controller_profile' from source: play vars 10215 1727204049.18178: variable 'controller_profile' from source: play vars 10215 1727204049.18215: Evaluated conditional (__network_wpa_supplicant_required): False 10215 1727204049.18219: when evaluation is False, skipping this task 10215 1727204049.18221: _execute() done 10215 1727204049.18226: dumping result to json 10215 1727204049.18233: done dumping result, returning 10215 1727204049.18239: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-3c74-8f8e-000000000033] 10215 1727204049.18245: sending task result for task 12b410aa-8751-3c74-8f8e-000000000033 10215 1727204049.18337: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000033 10215 1727204049.18341: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10215 1727204049.18413: no more pending results, returning what we have 10215 1727204049.18418: results queue empty 10215 1727204049.18419: checking for any_errors_fatal 10215 1727204049.18448: done checking for any_errors_fatal 10215 
1727204049.18449: checking for max_fail_percentage 10215 1727204049.18451: done checking for max_fail_percentage 10215 1727204049.18452: checking to see if all hosts have failed and the running result is not ok 10215 1727204049.18453: done checking to see if all hosts have failed 10215 1727204049.18454: getting the remaining hosts for this loop 10215 1727204049.18456: done getting the remaining hosts for this loop 10215 1727204049.18461: getting the next task for host managed-node3 10215 1727204049.18466: done getting next task for host managed-node3 10215 1727204049.18470: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10215 1727204049.18473: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204049.18488: getting variables 10215 1727204049.18491: in VariableManager get_vars() 10215 1727204049.18534: Calling all_inventory to load vars for managed-node3 10215 1727204049.18537: Calling groups_inventory to load vars for managed-node3 10215 1727204049.18540: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204049.18551: Calling all_plugins_play to load vars for managed-node3 10215 1727204049.18554: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204049.18557: Calling groups_plugins_play to load vars for managed-node3 10215 1727204049.19804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204049.21858: done with get_vars() 10215 1727204049.21886: done getting variables 10215 1727204049.21938: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.092) 0:00:17.786 ***** 10215 1727204049.21966: entering _queue_task() for managed-node3/service 10215 1727204049.22216: worker is 1 (out of 1 available) 10215 1727204049.22231: exiting _queue_task() for managed-node3/service 10215 1727204049.22245: done queuing things up, now waiting for results queue to drain 10215 1727204049.22247: waiting for pending results... 
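The skip of the wpa_supplicant task above follows from two conditionals visible in the trace: network_provider == "nm" (True here) and __network_wpa_supplicant_required, which evaluates to False because this bond-only configuration defines no 802.1X or wireless connections. A rough sketch of that gating pattern, assuming a plain service task rather than the role's literal source (the real role computes __network_wpa_supplicant_required in its defaults):

# Hypothetical gating pattern matching the conditionals evaluated in the log.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when:
    - network_provider == "nm"
    - __network_wpa_supplicant_required | bool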
10215 1727204049.22444: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 10215 1727204049.22540: in run() - task 12b410aa-8751-3c74-8f8e-000000000034 10215 1727204049.22554: variable 'ansible_search_path' from source: unknown 10215 1727204049.22558: variable 'ansible_search_path' from source: unknown 10215 1727204049.22596: calling self._execute() 10215 1727204049.22671: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.22678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.22689: variable 'omit' from source: magic vars 10215 1727204049.23163: variable 'ansible_distribution_major_version' from source: facts 10215 1727204049.23167: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204049.23450: variable 'network_provider' from source: set_fact 10215 1727204049.23453: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204049.23456: when evaluation is False, skipping this task 10215 1727204049.23458: _execute() done 10215 1727204049.23460: dumping result to json 10215 1727204049.23462: done dumping result, returning 10215 1727204049.23469: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-3c74-8f8e-000000000034] 10215 1727204049.23482: sending task result for task 12b410aa-8751-3c74-8f8e-000000000034 10215 1727204049.23801: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000034 10215 1727204049.23805: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204049.23939: no more pending results, returning what we have 10215 1727204049.23943: results queue empty 10215 1727204049.23944: checking for any_errors_fatal 10215 1727204049.23951: done checking for any_errors_fatal 10215 1727204049.23952: checking for max_fail_percentage 10215 1727204049.23954: done checking for max_fail_percentage 10215 1727204049.23954: checking to see if all hosts have failed and the running result is not ok 10215 1727204049.23956: done checking to see if all hosts have failed 10215 1727204049.23956: getting the remaining hosts for this loop 10215 1727204049.23958: done getting the remaining hosts for this loop 10215 1727204049.23961: getting the next task for host managed-node3 10215 1727204049.23967: done getting next task for host managed-node3 10215 1727204049.23971: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10215 1727204049.23974: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204049.23988: getting variables 10215 1727204049.23992: in VariableManager get_vars() 10215 1727204049.24030: Calling all_inventory to load vars for managed-node3 10215 1727204049.24034: Calling groups_inventory to load vars for managed-node3 10215 1727204049.24037: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204049.24048: Calling all_plugins_play to load vars for managed-node3 10215 1727204049.24052: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204049.24056: Calling groups_plugins_play to load vars for managed-node3 10215 1727204049.29121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204049.32798: done with get_vars() 10215 1727204049.32839: done getting variables 10215 1727204049.32925: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.109) 0:00:17.896 ***** 10215 1727204049.32966: entering _queue_task() for managed-node3/copy 10215 1727204049.33514: worker is 1 (out of 1 available) 10215 1727204049.33528: exiting _queue_task() for managed-node3/copy 10215 1727204049.33541: done queuing things up, now waiting for results queue to drain 10215 1727204049.33543: waiting for pending results... 10215 1727204049.33710: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10215 1727204049.34096: in run() - task 12b410aa-8751-3c74-8f8e-000000000035 10215 1727204049.34102: variable 'ansible_search_path' from source: unknown 10215 1727204049.34106: variable 'ansible_search_path' from source: unknown 10215 1727204049.34113: calling self._execute() 10215 1727204049.34116: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.34120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.34124: variable 'omit' from source: magic vars 10215 1727204049.34498: variable 'ansible_distribution_major_version' from source: facts 10215 1727204049.34512: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204049.34669: variable 'network_provider' from source: set_fact 10215 1727204049.34681: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204049.34685: when evaluation is False, skipping this task 10215 1727204049.34688: _execute() done 10215 1727204049.34694: dumping result to json 10215 1727204049.34699: done dumping result, returning 10215 1727204049.34713: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-3c74-8f8e-000000000035] 10215 1727204049.34716: sending task result for task 12b410aa-8751-3c74-8f8e-000000000035 10215 1727204049.34830: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000035 10215 1727204049.34834: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == 
\"initscripts\"", "skip_reason": "Conditional result was False" } 10215 1727204049.34893: no more pending results, returning what we have 10215 1727204049.34898: results queue empty 10215 1727204049.34899: checking for any_errors_fatal 10215 1727204049.34907: done checking for any_errors_fatal 10215 1727204049.34908: checking for max_fail_percentage 10215 1727204049.34910: done checking for max_fail_percentage 10215 1727204049.34911: checking to see if all hosts have failed and the running result is not ok 10215 1727204049.34912: done checking to see if all hosts have failed 10215 1727204049.34913: getting the remaining hosts for this loop 10215 1727204049.34915: done getting the remaining hosts for this loop 10215 1727204049.34920: getting the next task for host managed-node3 10215 1727204049.34928: done getting next task for host managed-node3 10215 1727204049.34932: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10215 1727204049.34936: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204049.34953: getting variables 10215 1727204049.34955: in VariableManager get_vars() 10215 1727204049.35113: Calling all_inventory to load vars for managed-node3 10215 1727204049.35117: Calling groups_inventory to load vars for managed-node3 10215 1727204049.35120: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204049.35135: Calling all_plugins_play to load vars for managed-node3 10215 1727204049.35139: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204049.35143: Calling groups_plugins_play to load vars for managed-node3 10215 1727204049.37559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204049.40565: done with get_vars() 10215 1727204049.40607: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.077) 0:00:17.974 ***** 10215 1727204049.40723: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10215 1727204049.40725: Creating lock for fedora.linux_system_roles.network_connections 10215 1727204049.41119: worker is 1 (out of 1 available) 10215 1727204049.41137: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10215 1727204049.41151: done queuing things up, now waiting for results queue to drain 10215 1727204049.41153: waiting for pending results... 
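The Configure networking connection profiles task queued above ends up invoking the fedora.linux_system_roles.network_connections module with the profile list below. This is a reconstruction from the module_args echoed later in its result; in the play itself the names are templated from controller_profile, controller_device, port1_profile, dhcp_interface1 and related vars rather than written as literals.

# Reconstructed from the module_args recorded in this task's result;
# literal names substituted for the play vars resolved in the trace.
network_connections:
  - name: bond0
    state: up
    type: bond
    interface_name: nm-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    state: up
    type: ethernet
    interface_name: test1
    controller: bond0
  - name: bond0.1
    state: up
    type: ethernet
    interface_name: test2
    controller: bond0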
10215 1727204049.41512: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10215 1727204049.41672: in run() - task 12b410aa-8751-3c74-8f8e-000000000036 10215 1727204049.41676: variable 'ansible_search_path' from source: unknown 10215 1727204049.41679: variable 'ansible_search_path' from source: unknown 10215 1727204049.42095: calling self._execute() 10215 1727204049.42099: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.42102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.42105: variable 'omit' from source: magic vars 10215 1727204049.42258: variable 'ansible_distribution_major_version' from source: facts 10215 1727204049.42277: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204049.42291: variable 'omit' from source: magic vars 10215 1727204049.42363: variable 'omit' from source: magic vars 10215 1727204049.42569: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204049.45231: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204049.45309: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204049.45356: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204049.45398: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204049.45434: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204049.45530: variable 'network_provider' from source: set_fact 10215 1727204049.45698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204049.45746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204049.45778: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204049.45836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204049.45853: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204049.46095: variable 'omit' from source: magic vars 10215 1727204049.46099: variable 'omit' from source: magic vars 10215 1727204049.46223: variable 'network_connections' from source: task vars 10215 1727204049.46242: variable 'controller_profile' from source: play vars 10215 1727204049.46317: variable 'controller_profile' from source: play vars 10215 1727204049.46326: variable 'controller_device' from source: play vars 10215 1727204049.46410: variable 'controller_device' from source: play vars 10215 1727204049.46419: variable 'port1_profile' 
from source: play vars 10215 1727204049.46499: variable 'port1_profile' from source: play vars 10215 1727204049.46510: variable 'dhcp_interface1' from source: play vars 10215 1727204049.46586: variable 'dhcp_interface1' from source: play vars 10215 1727204049.46595: variable 'controller_profile' from source: play vars 10215 1727204049.46666: variable 'controller_profile' from source: play vars 10215 1727204049.46684: variable 'port2_profile' from source: play vars 10215 1727204049.46758: variable 'port2_profile' from source: play vars 10215 1727204049.46766: variable 'dhcp_interface2' from source: play vars 10215 1727204049.46846: variable 'dhcp_interface2' from source: play vars 10215 1727204049.46853: variable 'controller_profile' from source: play vars 10215 1727204049.46934: variable 'controller_profile' from source: play vars 10215 1727204049.47178: variable 'omit' from source: magic vars 10215 1727204049.47187: variable '__lsr_ansible_managed' from source: task vars 10215 1727204049.47268: variable '__lsr_ansible_managed' from source: task vars 10215 1727204049.47494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10215 1727204049.47894: Loaded config def from plugin (lookup/template) 10215 1727204049.47898: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10215 1727204049.47901: File lookup term: get_ansible_managed.j2 10215 1727204049.47903: variable 'ansible_search_path' from source: unknown 10215 1727204049.47906: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10215 1727204049.47913: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10215 1727204049.47916: variable 'ansible_search_path' from source: unknown 10215 1727204049.58014: variable 'ansible_managed' from source: unknown 10215 1727204049.58596: variable 'omit' from source: magic vars 10215 1727204049.58600: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204049.58604: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204049.58609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204049.58612: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204049.58619: Loading ShellModule 'sh' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204049.58621: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204049.58624: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.58626: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.58628: Set connection var ansible_connection to ssh 10215 1727204049.58630: Set connection var ansible_pipelining to False 10215 1727204049.58632: Set connection var ansible_shell_type to sh 10215 1727204049.58634: Set connection var ansible_timeout to 10 10215 1727204049.58636: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204049.58646: Set connection var ansible_shell_executable to /bin/sh 10215 1727204049.58648: variable 'ansible_shell_executable' from source: unknown 10215 1727204049.58650: variable 'ansible_connection' from source: unknown 10215 1727204049.58653: variable 'ansible_module_compression' from source: unknown 10215 1727204049.58655: variable 'ansible_shell_type' from source: unknown 10215 1727204049.58657: variable 'ansible_shell_executable' from source: unknown 10215 1727204049.58659: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204049.58661: variable 'ansible_pipelining' from source: unknown 10215 1727204049.58663: variable 'ansible_timeout' from source: unknown 10215 1727204049.58665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204049.58800: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204049.58812: variable 'omit' from source: magic vars 10215 1727204049.58820: starting attempt loop 10215 1727204049.58823: running the handler 10215 1727204049.58847: _low_level_execute_command(): starting 10215 1727204049.58854: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204049.59619: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204049.59723: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204049.59746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204049.59759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204049.59784: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204049.59855: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10215 1727204049.61613: stdout chunk (state=3): >>>/root <<< 10215 1727204049.61815: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204049.61818: stdout chunk (state=3): >>><<< 10215 1727204049.61821: stderr chunk (state=3): >>><<< 10215 1727204049.61949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204049.61953: _low_level_execute_command(): starting 10215 1727204049.61956: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126 `" && echo ansible-tmp-1727204049.618464-11377-59273227094126="` echo /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126 `" ) && sleep 0' 10215 1727204049.62547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204049.62562: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204049.62611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204049.62636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204049.62651: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204049.62745: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204049.62758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204049.62776: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204049.62802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204049.62873: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204049.64901: stdout chunk (state=3): >>>ansible-tmp-1727204049.618464-11377-59273227094126=/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126 <<< 10215 1727204049.65068: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204049.65101: stderr chunk (state=3): >>><<< 10215 1727204049.65105: stdout chunk (state=3): >>><<< 10215 1727204049.65129: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204049.618464-11377-59273227094126=/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204049.65298: variable 'ansible_module_compression' from source: unknown 10215 1727204049.65301: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 10215 1727204049.65304: ANSIBALLZ: Acquiring lock 10215 1727204049.65309: ANSIBALLZ: Lock acquired: 139878724455408 10215 1727204049.65312: ANSIBALLZ: Creating module 10215 1727204049.96067: ANSIBALLZ: Writing module into payload 10215 1727204049.96562: ANSIBALLZ: Writing module 10215 1727204049.96604: ANSIBALLZ: Renaming module 10215 1727204049.96611: ANSIBALLZ: Done creating module 10215 1727204049.96637: variable 'ansible_facts' from source: unknown 10215 1727204049.96757: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py 10215 1727204049.96921: Sending initial data 10215 1727204049.96924: Sent initial data (166 bytes) 10215 1727204049.97775: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204049.97829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204049.97902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204049.99638: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10215 1727204049.99674: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204049.99714: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204049.99772: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5b3taz39 /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py <<< 10215 1727204049.99776: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py" <<< 10215 1727204049.99820: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5b3taz39" to remote "/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py" <<< 10215 1727204050.05373: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204050.05445: stderr chunk (state=3): >>><<< 10215 1727204050.05455: stdout chunk (state=3): >>><<< 10215 1727204050.05595: done transferring module to remote 10215 1727204050.05601: _low_level_execute_command(): starting 10215 1727204050.05604: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/ /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py && sleep 0' 10215 1727204050.06324: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204050.06344: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204050.06376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204050.06397: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204050.06430: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204050.06563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204050.08514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204050.08525: stdout chunk (state=3): >>><<< 10215 1727204050.08537: stderr chunk (state=3): >>><<< 10215 1727204050.08558: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204050.08567: _low_level_execute_command(): starting 10215 1727204050.08577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/AnsiballZ_network_connections.py && sleep 0' 10215 1727204050.09186: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204050.09206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204050.09226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204050.09244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204050.09269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204050.09283: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204050.09309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204050.09330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204050.09404: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204050.09438: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204050.09457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204050.09710: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204050.10051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204050.54682: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10215 1727204050.56959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204050.56972: stdout chunk (state=3): >>><<< 10215 1727204050.56992: stderr chunk (state=3): >>><<< 10215 1727204050.57022: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204050.57125: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204050.57148: _low_level_execute_command(): starting 10215 1727204050.57166: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204049.618464-11377-59273227094126/ > /dev/null 2>&1 && sleep 0' 10215 1727204050.57837: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204050.57852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204050.57911: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204050.57966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204050.57984: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204050.58012: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204050.58091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204050.60398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204050.60402: stderr chunk (state=3): >>><<< 10215 1727204050.60404: stdout chunk (state=3): >>><<< 10215 1727204050.60429: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204050.60433: handler run complete 10215 1727204050.60478: attempt loop complete, returning result 10215 1727204050.60482: _execute() done 10215 1727204050.60484: dumping result to json 10215 1727204050.60497: done dumping result, returning 10215 1727204050.60512: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-3c74-8f8e-000000000036] 10215 1727204050.60515: sending task result for task 12b410aa-8751-3c74-8f8e-000000000036 10215 1727204050.60736: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000036 changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 [008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active) 10215 1727204050.60946: no more pending results, returning what we have 10215 1727204050.60951: results queue empty 10215 1727204050.60952: checking for any_errors_fatal 10215 1727204050.60961: done checking for any_errors_fatal 10215 1727204050.60962: checking for max_fail_percentage 10215 1727204050.60964: done checking for max_fail_percentage 10215 1727204050.60965: checking to see if all hosts have failed and the running result is not ok 10215 1727204050.60967: done checking to see if all hosts have failed 10215 1727204050.60967: getting the remaining hosts for this loop 10215 1727204050.60969: done getting the remaining hosts for this loop 10215 
1727204050.60975: getting the next task for host managed-node3 10215 1727204050.60982: done getting next task for host managed-node3 10215 1727204050.60987: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10215 1727204050.61494: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204050.61510: getting variables 10215 1727204050.61513: in VariableManager get_vars() 10215 1727204050.61564: Calling all_inventory to load vars for managed-node3 10215 1727204050.61568: Calling groups_inventory to load vars for managed-node3 10215 1727204050.61571: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204050.61584: Calling all_plugins_play to load vars for managed-node3 10215 1727204050.61588: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204050.61594: Calling groups_plugins_play to load vars for managed-node3 10215 1727204050.62328: WORKER PROCESS EXITING 10215 1727204050.66065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204050.71066: done with get_vars() 10215 1727204050.71108: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:10 -0400 (0:00:01.304) 0:00:19.279 ***** 10215 1727204050.71219: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10215 1727204050.71221: Creating lock for fedora.linux_system_roles.network_state 10215 1727204050.71628: worker is 1 (out of 1 available) 10215 1727204050.71642: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10215 1727204050.71655: done queuing things up, now waiting for results queue to drain 10215 1727204050.71657: waiting for pending results... 
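The entries that follow show the "Configure networking state" task being skipped: network_state comes from the role defaults and the conditional network_state != {} evaluates to False. A minimal sketch of the two cases, assuming the role's network_state variable; the non-empty value is hypothetical and only shows what would make the task run.

network_state: {}        # role default, as in this run -> the state task is skipped

# Hypothetical non-empty value that would make the state task execute instead:
# network_state:
#   interfaces:
#     - name: nm-bond
#       state: up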
10215 1727204050.71916: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 10215 1727204050.72059: in run() - task 12b410aa-8751-3c74-8f8e-000000000037 10215 1727204050.72076: variable 'ansible_search_path' from source: unknown 10215 1727204050.72080: variable 'ansible_search_path' from source: unknown 10215 1727204050.72121: calling self._execute() 10215 1727204050.72226: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.72234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.72259: variable 'omit' from source: magic vars 10215 1727204050.72962: variable 'ansible_distribution_major_version' from source: facts 10215 1727204050.72966: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204050.72969: variable 'network_state' from source: role '' defaults 10215 1727204050.72972: Evaluated conditional (network_state != {}): False 10215 1727204050.72974: when evaluation is False, skipping this task 10215 1727204050.72976: _execute() done 10215 1727204050.72978: dumping result to json 10215 1727204050.72980: done dumping result, returning 10215 1727204050.72983: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-3c74-8f8e-000000000037] 10215 1727204050.72985: sending task result for task 12b410aa-8751-3c74-8f8e-000000000037 10215 1727204050.73095: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000037 10215 1727204050.73097: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204050.73253: no more pending results, returning what we have 10215 1727204050.73258: results queue empty 10215 1727204050.73259: checking for any_errors_fatal 10215 1727204050.73273: done checking for any_errors_fatal 10215 1727204050.73274: checking for max_fail_percentage 10215 1727204050.73276: done checking for max_fail_percentage 10215 1727204050.73278: checking to see if all hosts have failed and the running result is not ok 10215 1727204050.73279: done checking to see if all hosts have failed 10215 1727204050.73280: getting the remaining hosts for this loop 10215 1727204050.73282: done getting the remaining hosts for this loop 10215 1727204050.73287: getting the next task for host managed-node3 10215 1727204050.73297: done getting next task for host managed-node3 10215 1727204050.73301: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10215 1727204050.73305: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204050.73323: getting variables 10215 1727204050.73325: in VariableManager get_vars() 10215 1727204050.73371: Calling all_inventory to load vars for managed-node3 10215 1727204050.73375: Calling groups_inventory to load vars for managed-node3 10215 1727204050.73378: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204050.73785: Calling all_plugins_play to load vars for managed-node3 10215 1727204050.73793: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204050.73798: Calling groups_plugins_play to load vars for managed-node3 10215 1727204050.76436: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204050.79377: done with get_vars() 10215 1727204050.79418: done getting variables 10215 1727204050.79502: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.083) 0:00:19.362 ***** 10215 1727204050.79541: entering _queue_task() for managed-node3/debug 10215 1727204050.79987: worker is 1 (out of 1 available) 10215 1727204050.80012: exiting _queue_task() for managed-node3/debug 10215 1727204050.80024: done queuing things up, now waiting for results queue to drain 10215 1727204050.80027: waiting for pending results... 10215 1727204050.80231: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10215 1727204050.80355: in run() - task 12b410aa-8751-3c74-8f8e-000000000038 10215 1727204050.80362: variable 'ansible_search_path' from source: unknown 10215 1727204050.80365: variable 'ansible_search_path' from source: unknown 10215 1727204050.80401: calling self._execute() 10215 1727204050.80502: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.80509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.80516: variable 'omit' from source: magic vars 10215 1727204050.80929: variable 'ansible_distribution_major_version' from source: facts 10215 1727204050.80941: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204050.80952: variable 'omit' from source: magic vars 10215 1727204050.81018: variable 'omit' from source: magic vars 10215 1727204050.81061: variable 'omit' from source: magic vars 10215 1727204050.81109: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204050.81148: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204050.81169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204050.81194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204050.81211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204050.81251: variable 
'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204050.81254: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.81256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.81379: Set connection var ansible_connection to ssh 10215 1727204050.81398: Set connection var ansible_pipelining to False 10215 1727204050.81401: Set connection var ansible_shell_type to sh 10215 1727204050.81404: Set connection var ansible_timeout to 10 10215 1727204050.81406: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204050.81426: Set connection var ansible_shell_executable to /bin/sh 10215 1727204050.81491: variable 'ansible_shell_executable' from source: unknown 10215 1727204050.81494: variable 'ansible_connection' from source: unknown 10215 1727204050.81497: variable 'ansible_module_compression' from source: unknown 10215 1727204050.81500: variable 'ansible_shell_type' from source: unknown 10215 1727204050.81504: variable 'ansible_shell_executable' from source: unknown 10215 1727204050.81506: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.81511: variable 'ansible_pipelining' from source: unknown 10215 1727204050.81513: variable 'ansible_timeout' from source: unknown 10215 1727204050.81516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.81632: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204050.81647: variable 'omit' from source: magic vars 10215 1727204050.81653: starting attempt loop 10215 1727204050.81656: running the handler 10215 1727204050.81819: variable '__network_connections_result' from source: set_fact 10215 1727204050.81944: handler run complete 10215 1727204050.81947: attempt loop complete, returning result 10215 1727204050.81952: _execute() done 10215 1727204050.81955: dumping result to json 10215 1727204050.81958: done dumping result, returning 10215 1727204050.81961: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-3c74-8f8e-000000000038] 10215 1727204050.81964: sending task result for task 12b410aa-8751-3c74-8f8e-000000000038 10215 1727204050.82251: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000038 10215 1727204050.82255: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active)" ] } 10215 1727204050.82344: no more pending results, 
returning what we have 10215 1727204050.82348: results queue empty 10215 1727204050.82349: checking for any_errors_fatal 10215 1727204050.82354: done checking for any_errors_fatal 10215 1727204050.82355: checking for max_fail_percentage 10215 1727204050.82357: done checking for max_fail_percentage 10215 1727204050.82358: checking to see if all hosts have failed and the running result is not ok 10215 1727204050.82360: done checking to see if all hosts have failed 10215 1727204050.82361: getting the remaining hosts for this loop 10215 1727204050.82362: done getting the remaining hosts for this loop 10215 1727204050.82367: getting the next task for host managed-node3 10215 1727204050.82372: done getting next task for host managed-node3 10215 1727204050.82376: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10215 1727204050.82382: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204050.82395: getting variables 10215 1727204050.82397: in VariableManager get_vars() 10215 1727204050.82443: Calling all_inventory to load vars for managed-node3 10215 1727204050.82447: Calling groups_inventory to load vars for managed-node3 10215 1727204050.82450: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204050.82462: Calling all_plugins_play to load vars for managed-node3 10215 1727204050.82465: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204050.82469: Calling groups_plugins_play to load vars for managed-node3 10215 1727204050.84302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204050.87369: done with get_vars() 10215 1727204050.87417: done getting variables 10215 1727204050.87519: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.080) 0:00:19.442 ***** 10215 1727204050.87573: entering _queue_task() for managed-node3/debug 10215 1727204050.87972: worker is 1 (out of 1 available) 10215 1727204050.87988: exiting _queue_task() for managed-node3/debug 10215 1727204050.88011: done queuing things up, now waiting for results queue to drain 10215 1727204050.88013: waiting for pending results... 
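The "Show stderr messages for the network_connections" task that just completed, and the "Show debug messages" task queued next, print the registered module result. A minimal sketch of equivalent debug tasks, assuming the result is registered as __network_connections_result (the variable name the log reports from set_fact); this is an illustration, not the role's actual task file.

- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result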
10215 1727204050.88193: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10215 1727204050.88293: in run() - task 12b410aa-8751-3c74-8f8e-000000000039 10215 1727204050.88309: variable 'ansible_search_path' from source: unknown 10215 1727204050.88313: variable 'ansible_search_path' from source: unknown 10215 1727204050.88344: calling self._execute() 10215 1727204050.88421: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.88426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.88438: variable 'omit' from source: magic vars 10215 1727204050.88752: variable 'ansible_distribution_major_version' from source: facts 10215 1727204050.88762: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204050.88770: variable 'omit' from source: magic vars 10215 1727204050.88825: variable 'omit' from source: magic vars 10215 1727204050.88855: variable 'omit' from source: magic vars 10215 1727204050.88890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204050.88925: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204050.88942: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204050.88958: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204050.88969: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204050.88998: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204050.89002: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.89005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.89207: Set connection var ansible_connection to ssh 10215 1727204050.89211: Set connection var ansible_pipelining to False 10215 1727204050.89214: Set connection var ansible_shell_type to sh 10215 1727204050.89219: Set connection var ansible_timeout to 10 10215 1727204050.89222: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204050.89248: Set connection var ansible_shell_executable to /bin/sh 10215 1727204050.89294: variable 'ansible_shell_executable' from source: unknown 10215 1727204050.89298: variable 'ansible_connection' from source: unknown 10215 1727204050.89301: variable 'ansible_module_compression' from source: unknown 10215 1727204050.89303: variable 'ansible_shell_type' from source: unknown 10215 1727204050.89306: variable 'ansible_shell_executable' from source: unknown 10215 1727204050.89308: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.89363: variable 'ansible_pipelining' from source: unknown 10215 1727204050.89366: variable 'ansible_timeout' from source: unknown 10215 1727204050.89369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.89599: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 
1727204050.89603: variable 'omit' from source: magic vars 10215 1727204050.89606: starting attempt loop 10215 1727204050.89609: running the handler 10215 1727204050.89611: variable '__network_connections_result' from source: set_fact 10215 1727204050.89677: variable '__network_connections_result' from source: set_fact 10215 1727204050.90097: handler run complete 10215 1727204050.90151: attempt loop complete, returning result 10215 1727204050.90155: _execute() done 10215 1727204050.90158: dumping result to json 10215 1727204050.90166: done dumping result, returning 10215 1727204050.90176: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-3c74-8f8e-000000000039] 10215 1727204050.90183: sending task result for task 12b410aa-8751-3c74-8f8e-000000000039 ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 38ec32a3-a35b-422a-b696-8ced2fcaef41 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 0f7546ed-ab57-4407-aded-d224151d9f1f (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, bfcd0732-4903-4a21-9526-3b74a99394ee (not-active)" ] } } 10215 1727204050.90546: no more pending results, returning what we have 10215 1727204050.90554: results queue empty 10215 1727204050.90556: checking for any_errors_fatal 10215 1727204050.90562: done checking for any_errors_fatal 10215 1727204050.90563: checking for max_fail_percentage 10215 1727204050.90571: done checking for max_fail_percentage 10215 1727204050.90573: checking to see if all hosts have failed and the running result is not ok 10215 1727204050.90574: done 
checking to see if all hosts have failed 10215 1727204050.90575: getting the remaining hosts for this loop 10215 1727204050.90576: done getting the remaining hosts for this loop 10215 1727204050.90582: getting the next task for host managed-node3 10215 1727204050.90882: done getting next task for host managed-node3 10215 1727204050.90886: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10215 1727204050.90892: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204050.90904: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000039 10215 1727204050.90907: WORKER PROCESS EXITING 10215 1727204050.90919: getting variables 10215 1727204050.90921: in VariableManager get_vars() 10215 1727204050.90965: Calling all_inventory to load vars for managed-node3 10215 1727204050.90969: Calling groups_inventory to load vars for managed-node3 10215 1727204050.90972: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204050.90984: Calling all_plugins_play to load vars for managed-node3 10215 1727204050.90988: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204050.90995: Calling groups_plugins_play to load vars for managed-node3 10215 1727204050.92821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204050.94644: done with get_vars() 10215 1727204050.94679: done getting variables 10215 1727204050.94754: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.072) 0:00:19.515 ***** 10215 1727204050.94794: entering _queue_task() for managed-node3/debug 10215 1727204050.95152: worker is 1 (out of 1 available) 10215 1727204050.95171: exiting _queue_task() for managed-node3/debug 10215 1727204050.95187: done queuing things up, now waiting for results queue to drain 10215 1727204050.95217: waiting for pending results... 
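Later in the trace, after the network_state debug task is skipped for the same reason, the role re-tests connectivity with the ping module and receives "pong". A minimal sketch of such a re-test plus an optional guard, assuming ansible.builtin.ping; the register name and the fail task are hypothetical additions, not part of the role.

- name: Re-test connectivity
  ansible.builtin.ping:
  register: __ping_result          # hypothetical register name

- name: Stop if the managed node no longer answers
  ansible.builtin.fail:
    msg: Connectivity lost after reconfiguring bond0
  when: __ping_result.ping | default('') != 'pong'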
10215 1727204050.95430: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10215 1727204050.95535: in run() - task 12b410aa-8751-3c74-8f8e-00000000003a 10215 1727204050.95551: variable 'ansible_search_path' from source: unknown 10215 1727204050.95555: variable 'ansible_search_path' from source: unknown 10215 1727204050.95588: calling self._execute() 10215 1727204050.95666: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.95671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.95683: variable 'omit' from source: magic vars 10215 1727204050.96002: variable 'ansible_distribution_major_version' from source: facts 10215 1727204050.96020: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204050.96120: variable 'network_state' from source: role '' defaults 10215 1727204050.96132: Evaluated conditional (network_state != {}): False 10215 1727204050.96136: when evaluation is False, skipping this task 10215 1727204050.96139: _execute() done 10215 1727204050.96142: dumping result to json 10215 1727204050.96146: done dumping result, returning 10215 1727204050.96155: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-3c74-8f8e-00000000003a] 10215 1727204050.96161: sending task result for task 12b410aa-8751-3c74-8f8e-00000000003a 10215 1727204050.96253: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000003a 10215 1727204050.96256: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 10215 1727204050.96308: no more pending results, returning what we have 10215 1727204050.96313: results queue empty 10215 1727204050.96314: checking for any_errors_fatal 10215 1727204050.96324: done checking for any_errors_fatal 10215 1727204050.96326: checking for max_fail_percentage 10215 1727204050.96327: done checking for max_fail_percentage 10215 1727204050.96329: checking to see if all hosts have failed and the running result is not ok 10215 1727204050.96330: done checking to see if all hosts have failed 10215 1727204050.96331: getting the remaining hosts for this loop 10215 1727204050.96332: done getting the remaining hosts for this loop 10215 1727204050.96337: getting the next task for host managed-node3 10215 1727204050.96345: done getting next task for host managed-node3 10215 1727204050.96349: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10215 1727204050.96352: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204050.96368: getting variables 10215 1727204050.96369: in VariableManager get_vars() 10215 1727204050.96408: Calling all_inventory to load vars for managed-node3 10215 1727204050.96411: Calling groups_inventory to load vars for managed-node3 10215 1727204050.96413: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204050.96424: Calling all_plugins_play to load vars for managed-node3 10215 1727204050.96427: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204050.96430: Calling groups_plugins_play to load vars for managed-node3 10215 1727204050.97664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204050.99204: done with get_vars() 10215 1727204050.99225: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.045) 0:00:19.560 ***** 10215 1727204050.99304: entering _queue_task() for managed-node3/ping 10215 1727204050.99305: Creating lock for ping 10215 1727204050.99522: worker is 1 (out of 1 available) 10215 1727204050.99539: exiting _queue_task() for managed-node3/ping 10215 1727204050.99552: done queuing things up, now waiting for results queue to drain 10215 1727204050.99554: waiting for pending results... 10215 1727204050.99739: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 10215 1727204050.99833: in run() - task 12b410aa-8751-3c74-8f8e-00000000003b 10215 1727204050.99847: variable 'ansible_search_path' from source: unknown 10215 1727204050.99850: variable 'ansible_search_path' from source: unknown 10215 1727204050.99882: calling self._execute() 10215 1727204050.99958: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204050.99964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204050.99975: variable 'omit' from source: magic vars 10215 1727204051.00277: variable 'ansible_distribution_major_version' from source: facts 10215 1727204051.00287: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204051.00295: variable 'omit' from source: magic vars 10215 1727204051.00348: variable 'omit' from source: magic vars 10215 1727204051.00376: variable 'omit' from source: magic vars 10215 1727204051.00415: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204051.00446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204051.00465: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204051.00480: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204051.00492: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204051.00522: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204051.00526: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204051.00529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204051.00616: Set connection var ansible_connection to ssh 10215 
1727204051.00622: Set connection var ansible_pipelining to False 10215 1727204051.00629: Set connection var ansible_shell_type to sh 10215 1727204051.00635: Set connection var ansible_timeout to 10 10215 1727204051.00642: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204051.00652: Set connection var ansible_shell_executable to /bin/sh 10215 1727204051.00674: variable 'ansible_shell_executable' from source: unknown 10215 1727204051.00677: variable 'ansible_connection' from source: unknown 10215 1727204051.00680: variable 'ansible_module_compression' from source: unknown 10215 1727204051.00683: variable 'ansible_shell_type' from source: unknown 10215 1727204051.00687: variable 'ansible_shell_executable' from source: unknown 10215 1727204051.00692: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204051.00698: variable 'ansible_pipelining' from source: unknown 10215 1727204051.00701: variable 'ansible_timeout' from source: unknown 10215 1727204051.00706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204051.00878: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204051.00888: variable 'omit' from source: magic vars 10215 1727204051.00895: starting attempt loop 10215 1727204051.00898: running the handler 10215 1727204051.00914: _low_level_execute_command(): starting 10215 1727204051.00921: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204051.01467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204051.01471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204051.01477: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204051.01532: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204051.01535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204051.01582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.03333: stdout chunk (state=3): >>>/root <<< 10215 1727204051.03443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204051.03502: stderr chunk (state=3): >>><<< 10215 1727204051.03505: stdout chunk (state=3): >>><<< 10215 1727204051.03528: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204051.03540: _low_level_execute_command(): starting 10215 1727204051.03547: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092 `" && echo ansible-tmp-1727204051.035279-11439-270885530479092="` echo /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092 `" ) && sleep 0' 10215 1727204051.03994: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204051.04012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204051.04016: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204051.04035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204051.04095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204051.04098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204051.04120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204051.04145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.06119: stdout chunk (state=3): >>>ansible-tmp-1727204051.035279-11439-270885530479092=/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092 <<< 10215 1727204051.06238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204051.06283: stderr chunk (state=3): >>><<< 10215 1727204051.06286: stdout chunk (state=3): >>><<< 10215 1727204051.06305: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204051.035279-11439-270885530479092=/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204051.06352: variable 'ansible_module_compression' from source: unknown 10215 1727204051.06388: ANSIBALLZ: Using lock for ping 10215 1727204051.06393: ANSIBALLZ: Acquiring lock 10215 1727204051.06396: ANSIBALLZ: Lock acquired: 139878722734320 10215 1727204051.06402: ANSIBALLZ: Creating module 10215 1727204051.18786: ANSIBALLZ: Writing module into payload 10215 1727204051.18794: ANSIBALLZ: Writing module 10215 1727204051.18818: ANSIBALLZ: Renaming module 10215 1727204051.18824: ANSIBALLZ: Done creating module 10215 1727204051.18847: variable 'ansible_facts' from source: unknown 10215 1727204051.18955: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py 10215 1727204051.19269: Sending initial data 10215 1727204051.19273: Sent initial data (152 bytes) 10215 1727204051.20244: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204051.20263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204051.20334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.22192: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10215 1727204051.22229: stderr chunk (state=3): >>>debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204051.22585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204051.22618: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpdzozc4_3 /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py <<< 10215 1727204051.22622: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py" <<< 10215 1727204051.22682: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpdzozc4_3" to remote "/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py" <<< 10215 1727204051.24681: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204051.24838: stderr chunk (state=3): >>><<< 10215 1727204051.24842: stdout chunk (state=3): >>><<< 10215 1727204051.24845: done transferring module to remote 10215 1727204051.24847: _low_level_execute_command(): starting 10215 1727204051.24850: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/ /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py && sleep 0' 10215 1727204051.25924: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204051.25939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204051.25952: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204051.26134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204051.26305: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204051.26425: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 10215 1727204051.26457: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.28402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204051.28416: stdout chunk (state=3): >>><<< 10215 1727204051.28428: stderr chunk (state=3): >>><<< 10215 1727204051.28449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204051.28458: _low_level_execute_command(): starting 10215 1727204051.28694: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/AnsiballZ_ping.py && sleep 0' 10215 1727204051.29807: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204051.29873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204051.29888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204051.30005: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204051.30098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204051.30102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204051.30173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.47366: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10215 1727204051.48834: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204051.48962: stderr chunk (state=3): >>><<< 10215 1727204051.48976: stdout chunk (state=3): >>><<< 10215 1727204051.49012: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204051.49199: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204051.49203: _low_level_execute_command(): starting 10215 1727204051.49206: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204051.035279-11439-270885530479092/ > /dev/null 2>&1 && sleep 0' 10215 1727204051.50447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204051.50494: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204051.52437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204051.52567: stderr chunk (state=3): >>><<< 10215 1727204051.52625: stdout chunk (state=3): >>><<< 10215 1727204051.52648: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204051.52682: handler run complete 10215 1727204051.52747: attempt loop complete, returning result 10215 1727204051.52755: _execute() done 10215 1727204051.52890: dumping result to json 10215 1727204051.52893: done dumping result, returning 10215 1727204051.52898: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-3c74-8f8e-00000000003b] 10215 1727204051.52900: sending task result for task 12b410aa-8751-3c74-8f8e-00000000003b ok: [managed-node3] => { "changed": false, "ping": "pong" } 10215 1727204051.53166: no more pending results, returning what we have 10215 1727204051.53171: results queue empty 10215 1727204051.53172: checking for any_errors_fatal 10215 1727204051.53180: done checking for any_errors_fatal 10215 1727204051.53181: checking for max_fail_percentage 10215 1727204051.53182: done checking for max_fail_percentage 10215 1727204051.53183: checking to see if all hosts have failed and the running result is not ok 10215 1727204051.53185: done checking to see if all hosts have failed 10215 1727204051.53185: getting the remaining hosts for this loop 10215 1727204051.53188: done getting the remaining hosts for this loop 10215 1727204051.53195: getting the next task for host managed-node3 10215 1727204051.53207: done getting next task for host managed-node3 10215 1727204051.53210: ^ task is: TASK: meta (role_complete) 10215 1727204051.53213: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204051.53227: getting variables 10215 1727204051.53229: in VariableManager get_vars() 10215 1727204051.53278: Calling all_inventory to load vars for managed-node3 10215 1727204051.53282: Calling groups_inventory to load vars for managed-node3 10215 1727204051.53286: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204051.53705: Calling all_plugins_play to load vars for managed-node3 10215 1727204051.53709: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204051.53714: Calling groups_plugins_play to load vars for managed-node3 10215 1727204051.54351: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000003b 10215 1727204051.54355: WORKER PROCESS EXITING 10215 1727204051.59036: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204051.62995: done with get_vars() 10215 1727204051.63050: done getting variables 10215 1727204051.63159: done queuing things up, now waiting for results queue to drain 10215 1727204051.63162: results queue empty 10215 1727204051.63163: checking for any_errors_fatal 10215 1727204051.63166: done checking for any_errors_fatal 10215 1727204051.63167: checking for max_fail_percentage 10215 1727204051.63169: done checking for max_fail_percentage 10215 1727204051.63170: checking to see if all hosts have failed and the running result is not ok 10215 1727204051.63171: done checking to see if all hosts have failed 10215 1727204051.63172: getting the remaining hosts for this loop 10215 1727204051.63173: done getting the remaining hosts for this loop 10215 1727204051.63176: getting the next task for host managed-node3 10215 1727204051.63182: done getting next task for host managed-node3 10215 1727204051.63185: ^ task is: TASK: Include the task 'get_interface_stat.yml' 10215 1727204051.63187: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204051.63192: getting variables 10215 1727204051.63193: in VariableManager get_vars() 10215 1727204051.63211: Calling all_inventory to load vars for managed-node3 10215 1727204051.63214: Calling groups_inventory to load vars for managed-node3 10215 1727204051.63222: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204051.63228: Calling all_plugins_play to load vars for managed-node3 10215 1727204051.63231: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204051.63235: Calling groups_plugins_play to load vars for managed-node3 10215 1727204051.66514: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204051.69810: done with get_vars() 10215 1727204051.69849: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.706) 0:00:20.266 ***** 10215 1727204051.69949: entering _queue_task() for managed-node3/include_tasks 10215 1727204051.71068: worker is 1 (out of 1 available) 10215 1727204051.71083: exiting _queue_task() for managed-node3/include_tasks 10215 1727204051.71098: done queuing things up, now waiting for results queue to drain 10215 1727204051.71100: waiting for pending results... 10215 1727204051.71702: running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' 10215 1727204051.72119: in run() - task 12b410aa-8751-3c74-8f8e-00000000006e 10215 1727204051.72594: variable 'ansible_search_path' from source: unknown 10215 1727204051.72599: variable 'ansible_search_path' from source: unknown 10215 1727204051.72602: calling self._execute() 10215 1727204051.72679: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204051.72996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204051.73000: variable 'omit' from source: magic vars 10215 1727204051.73565: variable 'ansible_distribution_major_version' from source: facts 10215 1727204051.73994: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204051.73998: _execute() done 10215 1727204051.74001: dumping result to json 10215 1727204051.74004: done dumping result, returning 10215 1727204051.74006: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-3c74-8f8e-00000000006e] 10215 1727204051.74012: sending task result for task 12b410aa-8751-3c74-8f8e-00000000006e 10215 1727204051.74101: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000006e 10215 1727204051.74105: WORKER PROCESS EXITING 10215 1727204051.74142: no more pending results, returning what we have 10215 1727204051.74149: in VariableManager get_vars() 10215 1727204051.74205: Calling all_inventory to load vars for managed-node3 10215 1727204051.74208: Calling groups_inventory to load vars for managed-node3 10215 1727204051.74212: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204051.74229: Calling all_plugins_play to load vars for managed-node3 10215 1727204051.74233: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204051.74237: Calling groups_plugins_play to load vars for managed-node3 10215 1727204051.77492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10215 1727204051.81104: done with get_vars() 10215 1727204051.81150: variable 'ansible_search_path' from source: unknown 10215 1727204051.81152: variable 'ansible_search_path' from source: unknown 10215 1727204051.81204: we have included files to process 10215 1727204051.81206: generating all_blocks data 10215 1727204051.81210: done generating all_blocks data 10215 1727204051.81216: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204051.81217: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204051.81220: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 10215 1727204051.81524: done processing included file 10215 1727204051.81526: iterating over new_blocks loaded from include file 10215 1727204051.81528: in VariableManager get_vars() 10215 1727204051.81555: done with get_vars() 10215 1727204051.81557: filtering new block on tags 10215 1727204051.81592: done filtering new block on tags 10215 1727204051.81603: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node3 10215 1727204051.81612: extending task lists for all hosts with included blocks 10215 1727204051.81770: done extending task lists 10215 1727204051.81772: done processing included files 10215 1727204051.81773: results queue empty 10215 1727204051.81774: checking for any_errors_fatal 10215 1727204051.81776: done checking for any_errors_fatal 10215 1727204051.81777: checking for max_fail_percentage 10215 1727204051.81778: done checking for max_fail_percentage 10215 1727204051.81779: checking to see if all hosts have failed and the running result is not ok 10215 1727204051.81780: done checking to see if all hosts have failed 10215 1727204051.81783: getting the remaining hosts for this loop 10215 1727204051.81785: done getting the remaining hosts for this loop 10215 1727204051.81788: getting the next task for host managed-node3 10215 1727204051.81828: done getting next task for host managed-node3 10215 1727204051.81831: ^ task is: TASK: Get stat for interface {{ interface }} 10215 1727204051.81835: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204051.81838: getting variables 10215 1727204051.81839: in VariableManager get_vars() 10215 1727204051.81856: Calling all_inventory to load vars for managed-node3 10215 1727204051.81858: Calling groups_inventory to load vars for managed-node3 10215 1727204051.81861: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204051.81868: Calling all_plugins_play to load vars for managed-node3 10215 1727204051.81872: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204051.81876: Calling groups_plugins_play to load vars for managed-node3 10215 1727204051.96625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.03283: done with get_vars() 10215 1727204052.03330: done getting variables 10215 1727204052.03743: variable 'interface' from source: task vars 10215 1727204052.03747: variable 'controller_device' from source: play vars 10215 1727204052.03933: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.340) 0:00:20.606 ***** 10215 1727204052.03968: entering _queue_task() for managed-node3/stat 10215 1727204052.04756: worker is 1 (out of 1 available) 10215 1727204052.04771: exiting _queue_task() for managed-node3/stat 10215 1727204052.04784: done queuing things up, now waiting for results queue to drain 10215 1727204052.04786: waiting for pending results... 10215 1727204052.05203: running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond 10215 1727204052.05697: in run() - task 12b410aa-8751-3c74-8f8e-000000000241 10215 1727204052.05701: variable 'ansible_search_path' from source: unknown 10215 1727204052.05704: variable 'ansible_search_path' from source: unknown 10215 1727204052.05707: calling self._execute() 10215 1727204052.06095: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.06099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.06102: variable 'omit' from source: magic vars 10215 1727204052.06753: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.06777: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.06792: variable 'omit' from source: magic vars 10215 1727204052.06869: variable 'omit' from source: magic vars 10215 1727204052.06997: variable 'interface' from source: task vars 10215 1727204052.07010: variable 'controller_device' from source: play vars 10215 1727204052.07093: variable 'controller_device' from source: play vars 10215 1727204052.07125: variable 'omit' from source: magic vars 10215 1727204052.07178: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204052.07232: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204052.07261: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204052.07288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.07311: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 10215 1727204052.07359: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204052.07369: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.07378: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.07512: Set connection var ansible_connection to ssh 10215 1727204052.07527: Set connection var ansible_pipelining to False 10215 1727204052.07546: Set connection var ansible_shell_type to sh 10215 1727204052.07559: Set connection var ansible_timeout to 10 10215 1727204052.07573: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204052.07591: Set connection var ansible_shell_executable to /bin/sh 10215 1727204052.07624: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.07632: variable 'ansible_connection' from source: unknown 10215 1727204052.07639: variable 'ansible_module_compression' from source: unknown 10215 1727204052.07649: variable 'ansible_shell_type' from source: unknown 10215 1727204052.07662: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.07671: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.07680: variable 'ansible_pipelining' from source: unknown 10215 1727204052.07692: variable 'ansible_timeout' from source: unknown 10215 1727204052.07701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.07949: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204052.07968: variable 'omit' from source: magic vars 10215 1727204052.07979: starting attempt loop 10215 1727204052.07987: running the handler 10215 1727204052.08012: _low_level_execute_command(): starting 10215 1727204052.08030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204052.08810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.08917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.08946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.08963: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204052.08993: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.09107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.10904: stdout 
chunk (state=3): >>>/root <<< 10215 1727204052.11012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.11078: stderr chunk (state=3): >>><<< 10215 1727204052.11091: stdout chunk (state=3): >>><<< 10215 1727204052.11199: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.11203: _low_level_execute_command(): starting 10215 1727204052.11206: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366 `" && echo ansible-tmp-1727204052.1112962-11475-149336357347366="` echo /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366 `" ) && sleep 0' 10215 1727204052.11820: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204052.11882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.11966: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.11994: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204052.12017: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.12091: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.14084: stdout chunk (state=3): >>>ansible-tmp-1727204052.1112962-11475-149336357347366=/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366 <<< 10215 1727204052.14396: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 10215 1727204052.14399: stdout chunk (state=3): >>><<< 10215 1727204052.14402: stderr chunk (state=3): >>><<< 10215 1727204052.14404: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.1112962-11475-149336357347366=/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.14407: variable 'ansible_module_compression' from source: unknown 10215 1727204052.14437: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204052.14484: variable 'ansible_facts' from source: unknown 10215 1727204052.14585: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py 10215 1727204052.14770: Sending initial data 10215 1727204052.14773: Sent initial data (153 bytes) 10215 1727204052.15535: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.15541: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.15544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204052.15592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.15618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.17247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" 
revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204052.17278: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204052.17323: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp16fsc16a /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py <<< 10215 1727204052.17328: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py" <<< 10215 1727204052.17351: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp16fsc16a" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py" <<< 10215 1727204052.18395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.18399: stdout chunk (state=3): >>><<< 10215 1727204052.18403: stderr chunk (state=3): >>><<< 10215 1727204052.18406: done transferring module to remote 10215 1727204052.18410: _low_level_execute_command(): starting 10215 1727204052.18413: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/ /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py && sleep 0' 10215 1727204052.19156: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204052.19166: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.19178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.19295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204052.19299: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204052.19302: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204052.19305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.19307: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204052.19309: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204052.19311: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204052.19313: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.19315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.19318: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.19498: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.19506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204052.19707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.19820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.21741: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.21806: stderr chunk (state=3): >>><<< 10215 1727204052.21810: stdout chunk (state=3): >>><<< 10215 1727204052.21832: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.21835: _low_level_execute_command(): starting 10215 1727204052.21841: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/AnsiballZ_stat.py && sleep 0' 10215 1727204052.22513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204052.22518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.22602: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 10215 1727204052.40320: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35677, "dev": 23, "nlink": 1, "atime": 1727204050.395141, "mtime": 1727204050.395141, "ctime": 1727204050.395141, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 1727204052.41737: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204052.41795: stderr chunk (state=3): >>><<< 10215 1727204052.41798: stdout chunk (state=3): >>><<< 10215 1727204052.41819: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 35677, "dev": 23, "nlink": 1, "atime": 1727204050.395141, "mtime": 1727204050.395141, "ctime": 1727204050.395141, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
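
The AnsiballZ_stat.py run above is the "Get stat for interface nm-bond" task loaded from tasks/get_interface_stat.yml. The task file itself is not reproduced in this log, but from the module_args echoed in the result ("path": "/sys/class/net/nm-bond" with get_attributes, get_checksum and get_mime all false) a minimal sketch of that task might look like the following, assuming the path is templated from the "interface" variable and the result is saved as interface_stat (both names are inferred from later log lines, not from the file contents):

  # get_interface_stat.yml (sketch reconstructed from the logged module_args; not the verbatim file)
  - name: Get stat for interface {{ interface }}
    ansible.builtin.stat:
      path: /sys/class/net/{{ interface }}   # resolves to /sys/class/net/nm-bond in this run
      get_attributes: false                  # matches the logged module_args
      get_checksum: false
      get_mime: false
    register: interface_stat                 # assumed name; the later assert reads interface_stat.stat.exists

Probing /sys/class/net/<device> is a cheap presence check: the kernel exposes each network interface there as a symlink into /sys/devices, which is why the stat result above reports islnk: true with lnk_target ../../devices/virtual/net/nm-bond.
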
10215 1727204052.41869: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204052.41879: _low_level_execute_command(): starting 10215 1727204052.41885: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.1112962-11475-149336357347366/ > /dev/null 2>&1 && sleep 0' 10215 1727204052.42378: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.42382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.42391: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.42394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.42444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.42448: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.42492: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.44499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.44503: stdout chunk (state=3): >>><<< 10215 1727204052.44505: stderr chunk (state=3): >>><<< 10215 1727204052.44557: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.44560: handler run complete 10215 1727204052.44609: attempt loop complete, returning result 10215 1727204052.44616: _execute() done 10215 1727204052.44619: dumping result to json 10215 1727204052.44630: done dumping result, returning 10215 1727204052.44665: done running TaskExecutor() for managed-node3/TASK: Get stat for interface nm-bond [12b410aa-8751-3c74-8f8e-000000000241] 10215 1727204052.44669: sending task result for task 12b410aa-8751-3c74-8f8e-000000000241 10215 1727204052.44782: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000241 10215 1727204052.44785: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "atime": 1727204050.395141, "block_size": 4096, "blocks": 0, "ctime": 1727204050.395141, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 35677, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1727204050.395141, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 10215 1727204052.45009: no more pending results, returning what we have 10215 1727204052.45013: results queue empty 10215 1727204052.45014: checking for any_errors_fatal 10215 1727204052.45016: done checking for any_errors_fatal 10215 1727204052.45016: checking for max_fail_percentage 10215 1727204052.45018: done checking for max_fail_percentage 10215 1727204052.45019: checking to see if all hosts have failed and the running result is not ok 10215 1727204052.45020: done checking to see if all hosts have failed 10215 1727204052.45021: getting the remaining hosts for this loop 10215 1727204052.45023: done getting the remaining hosts for this loop 10215 1727204052.45027: getting the next task for host managed-node3 10215 1727204052.45035: done getting next task for host managed-node3 10215 1727204052.45038: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 10215 1727204052.45041: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204052.45045: getting variables 10215 1727204052.45046: in VariableManager get_vars() 10215 1727204052.45096: Calling all_inventory to load vars for managed-node3 10215 1727204052.45100: Calling groups_inventory to load vars for managed-node3 10215 1727204052.45104: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.45120: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.45124: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.45128: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.47687: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.51887: done with get_vars() 10215 1727204052.51932: done getting variables 10215 1727204052.52005: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204052.52156: variable 'interface' from source: task vars 10215 1727204052.52160: variable 'controller_device' from source: play vars 10215 1727204052.52240: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.483) 0:00:21.089 ***** 10215 1727204052.52276: entering _queue_task() for managed-node3/assert 10215 1727204052.52637: worker is 1 (out of 1 available) 10215 1727204052.52650: exiting _queue_task() for managed-node3/assert 10215 1727204052.52661: done queuing things up, now waiting for results queue to drain 10215 1727204052.52663: waiting for pending results... 
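
The task just queued comes from assert_device_present.yml:5, and the trace below shows it does nothing more than evaluate interface_stat.stat.exists and report "All assertions passed". A minimal sketch of that tasks file, assuming the include/assert structure implied by the two task paths in this log (line 3 includes get_interface_stat.yml, line 5 runs the assert) and omitting any failure message, which is not shown here:

  # assert_device_present.yml (sketch inferred from the task paths and the evaluated conditional; not the verbatim file)
  - name: Include the task 'get_interface_stat.yml'
    ansible.builtin.include_tasks: get_interface_stat.yml

  - name: Assert that the interface is present - '{{ interface }}'
    ansible.builtin.assert:
      that:
        - interface_stat.stat.exists   # the conditional the log reports as "Evaluated ... True"

Because that conditional evaluates to True, the assert action plugin returns changed: false with "All assertions passed", which is exactly the ok: [managed-node3] result printed below.
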
10215 1727204052.53206: running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' 10215 1727204052.53215: in run() - task 12b410aa-8751-3c74-8f8e-00000000006f 10215 1727204052.53222: variable 'ansible_search_path' from source: unknown 10215 1727204052.53225: variable 'ansible_search_path' from source: unknown 10215 1727204052.53229: calling self._execute() 10215 1727204052.53320: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.53342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.53360: variable 'omit' from source: magic vars 10215 1727204052.53816: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.53834: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.53846: variable 'omit' from source: magic vars 10215 1727204052.53921: variable 'omit' from source: magic vars 10215 1727204052.54050: variable 'interface' from source: task vars 10215 1727204052.54061: variable 'controller_device' from source: play vars 10215 1727204052.54148: variable 'controller_device' from source: play vars 10215 1727204052.54175: variable 'omit' from source: magic vars 10215 1727204052.54230: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204052.54307: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204052.54310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204052.54330: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.54348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.54384: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204052.54395: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.54402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.54553: Set connection var ansible_connection to ssh 10215 1727204052.54633: Set connection var ansible_pipelining to False 10215 1727204052.54636: Set connection var ansible_shell_type to sh 10215 1727204052.54644: Set connection var ansible_timeout to 10 10215 1727204052.54646: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204052.54649: Set connection var ansible_shell_executable to /bin/sh 10215 1727204052.54651: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.54653: variable 'ansible_connection' from source: unknown 10215 1727204052.54662: variable 'ansible_module_compression' from source: unknown 10215 1727204052.54670: variable 'ansible_shell_type' from source: unknown 10215 1727204052.54677: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.54684: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.54696: variable 'ansible_pipelining' from source: unknown 10215 1727204052.54704: variable 'ansible_timeout' from source: unknown 10215 1727204052.54714: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.54906: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204052.54924: variable 'omit' from source: magic vars 10215 1727204052.54961: starting attempt loop 10215 1727204052.54964: running the handler 10215 1727204052.55131: variable 'interface_stat' from source: set_fact 10215 1727204052.55161: Evaluated conditional (interface_stat.stat.exists): True 10215 1727204052.55193: handler run complete 10215 1727204052.55212: attempt loop complete, returning result 10215 1727204052.55299: _execute() done 10215 1727204052.55308: dumping result to json 10215 1727204052.55311: done dumping result, returning 10215 1727204052.55313: done running TaskExecutor() for managed-node3/TASK: Assert that the interface is present - 'nm-bond' [12b410aa-8751-3c74-8f8e-00000000006f] 10215 1727204052.55316: sending task result for task 12b410aa-8751-3c74-8f8e-00000000006f 10215 1727204052.55395: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000006f 10215 1727204052.55399: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204052.55555: no more pending results, returning what we have 10215 1727204052.55559: results queue empty 10215 1727204052.55560: checking for any_errors_fatal 10215 1727204052.55572: done checking for any_errors_fatal 10215 1727204052.55573: checking for max_fail_percentage 10215 1727204052.55575: done checking for max_fail_percentage 10215 1727204052.55576: checking to see if all hosts have failed and the running result is not ok 10215 1727204052.55578: done checking to see if all hosts have failed 10215 1727204052.55579: getting the remaining hosts for this loop 10215 1727204052.55580: done getting the remaining hosts for this loop 10215 1727204052.55586: getting the next task for host managed-node3 10215 1727204052.55599: done getting next task for host managed-node3 10215 1727204052.55604: ^ task is: TASK: Include the task 'assert_profile_present.yml' 10215 1727204052.55606: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204052.55611: getting variables 10215 1727204052.55614: in VariableManager get_vars() 10215 1727204052.55665: Calling all_inventory to load vars for managed-node3 10215 1727204052.55670: Calling groups_inventory to load vars for managed-node3 10215 1727204052.55674: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.55805: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.55812: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.55817: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.58037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.59674: done with get_vars() 10215 1727204052.59708: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.075) 0:00:21.165 ***** 10215 1727204052.59818: entering _queue_task() for managed-node3/include_tasks 10215 1727204052.60161: worker is 1 (out of 1 available) 10215 1727204052.60175: exiting _queue_task() for managed-node3/include_tasks 10215 1727204052.60187: done queuing things up, now waiting for results queue to drain 10215 1727204052.60188: waiting for pending results... 10215 1727204052.60494: running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' 10215 1727204052.60621: in run() - task 12b410aa-8751-3c74-8f8e-000000000070 10215 1727204052.60646: variable 'ansible_search_path' from source: unknown 10215 1727204052.60714: variable 'controller_profile' from source: play vars 10215 1727204052.60882: variable 'controller_profile' from source: play vars 10215 1727204052.60895: variable 'port1_profile' from source: play vars 10215 1727204052.60955: variable 'port1_profile' from source: play vars 10215 1727204052.60959: variable 'port2_profile' from source: play vars 10215 1727204052.61015: variable 'port2_profile' from source: play vars 10215 1727204052.61027: variable 'omit' from source: magic vars 10215 1727204052.61144: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.61159: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.61171: variable 'omit' from source: magic vars 10215 1727204052.61378: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.61388: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.61418: variable 'item' from source: unknown 10215 1727204052.61467: variable 'item' from source: unknown 10215 1727204052.61592: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.61595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.61598: variable 'omit' from source: magic vars 10215 1727204052.61721: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.61726: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.61748: variable 'item' from source: unknown 10215 1727204052.61801: variable 'item' from source: unknown 10215 1727204052.61888: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.61893: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 
1727204052.61901: variable 'omit' from source: magic vars 10215 1727204052.62026: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.62030: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.62057: variable 'item' from source: unknown 10215 1727204052.62110: variable 'item' from source: unknown 10215 1727204052.62174: dumping result to json 10215 1727204052.62178: done dumping result, returning 10215 1727204052.62181: done running TaskExecutor() for managed-node3/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-3c74-8f8e-000000000070] 10215 1727204052.62183: sending task result for task 12b410aa-8751-3c74-8f8e-000000000070 10215 1727204052.62233: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000070 10215 1727204052.62236: WORKER PROCESS EXITING 10215 1727204052.62276: no more pending results, returning what we have 10215 1727204052.62281: in VariableManager get_vars() 10215 1727204052.62332: Calling all_inventory to load vars for managed-node3 10215 1727204052.62336: Calling groups_inventory to load vars for managed-node3 10215 1727204052.62340: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.62361: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.62364: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.62368: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.64450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.66110: done with get_vars() 10215 1727204052.66147: variable 'ansible_search_path' from source: unknown 10215 1727204052.66166: variable 'ansible_search_path' from source: unknown 10215 1727204052.66176: variable 'ansible_search_path' from source: unknown 10215 1727204052.66184: we have included files to process 10215 1727204052.66185: generating all_blocks data 10215 1727204052.66187: done generating all_blocks data 10215 1727204052.66194: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.66195: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.66198: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.66456: in VariableManager get_vars() 10215 1727204052.66486: done with get_vars() 10215 1727204052.66816: done processing included file 10215 1727204052.66819: iterating over new_blocks loaded from include file 10215 1727204052.66820: in VariableManager get_vars() 10215 1727204052.66850: done with get_vars() 10215 1727204052.66855: filtering new block on tags 10215 1727204052.66884: done filtering new block on tags 10215 1727204052.66888: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0) 10215 1727204052.66895: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.66897: loading included file: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.66900: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.67019: in VariableManager get_vars() 10215 1727204052.67048: done with get_vars() 10215 1727204052.67331: done processing included file 10215 1727204052.67333: iterating over new_blocks loaded from include file 10215 1727204052.67334: in VariableManager get_vars() 10215 1727204052.67357: done with get_vars() 10215 1727204052.67359: filtering new block on tags 10215 1727204052.67382: done filtering new block on tags 10215 1727204052.67385: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.0) 10215 1727204052.67391: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.67392: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.67395: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 10215 1727204052.67625: in VariableManager get_vars() 10215 1727204052.67652: done with get_vars() 10215 1727204052.67944: done processing included file 10215 1727204052.67947: iterating over new_blocks loaded from include file 10215 1727204052.67948: in VariableManager get_vars() 10215 1727204052.67971: done with get_vars() 10215 1727204052.67973: filtering new block on tags 10215 1727204052.67998: done filtering new block on tags 10215 1727204052.68001: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node3 => (item=bond0.1) 10215 1727204052.68006: extending task lists for all hosts with included blocks 10215 1727204052.70363: done extending task lists 10215 1727204052.70371: done processing included files 10215 1727204052.70372: results queue empty 10215 1727204052.70373: checking for any_errors_fatal 10215 1727204052.70377: done checking for any_errors_fatal 10215 1727204052.70378: checking for max_fail_percentage 10215 1727204052.70379: done checking for max_fail_percentage 10215 1727204052.70380: checking to see if all hosts have failed and the running result is not ok 10215 1727204052.70381: done checking to see if all hosts have failed 10215 1727204052.70382: getting the remaining hosts for this loop 10215 1727204052.70383: done getting the remaining hosts for this loop 10215 1727204052.70386: getting the next task for host managed-node3 10215 1727204052.70393: done getting next task for host managed-node3 10215 1727204052.70396: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10215 1727204052.70399: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204052.70402: getting variables 10215 1727204052.70403: in VariableManager get_vars() 10215 1727204052.70425: Calling all_inventory to load vars for managed-node3 10215 1727204052.70428: Calling groups_inventory to load vars for managed-node3 10215 1727204052.70431: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.70439: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.70442: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.70446: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.71973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.73642: done with get_vars() 10215 1727204052.73664: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.139) 0:00:21.304 ***** 10215 1727204052.73735: entering _queue_task() for managed-node3/include_tasks 10215 1727204052.74027: worker is 1 (out of 1 available) 10215 1727204052.74042: exiting _queue_task() for managed-node3/include_tasks 10215 1727204052.74056: done queuing things up, now waiting for results queue to drain 10215 1727204052.74058: waiting for pending results... 10215 1727204052.74247: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 10215 1727204052.74329: in run() - task 12b410aa-8751-3c74-8f8e-00000000025f 10215 1727204052.74343: variable 'ansible_search_path' from source: unknown 10215 1727204052.74346: variable 'ansible_search_path' from source: unknown 10215 1727204052.74381: calling self._execute() 10215 1727204052.74465: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.74473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.74484: variable 'omit' from source: magic vars 10215 1727204052.74820: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.74833: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.74841: _execute() done 10215 1727204052.74844: dumping result to json 10215 1727204052.74847: done dumping result, returning 10215 1727204052.74857: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-3c74-8f8e-00000000025f] 10215 1727204052.74862: sending task result for task 12b410aa-8751-3c74-8f8e-00000000025f 10215 1727204052.74961: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000025f 10215 1727204052.74964: WORKER PROCESS EXITING 10215 1727204052.74997: no more pending results, returning what we have 10215 1727204052.75003: in VariableManager get_vars() 10215 1727204052.75054: Calling all_inventory to load vars for managed-node3 10215 1727204052.75058: Calling groups_inventory to load vars for managed-node3 10215 1727204052.75060: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.75076: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.75079: Calling groups_plugins_inventory to load vars for 
managed-node3 10215 1727204052.75082: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.76343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.77911: done with get_vars() 10215 1727204052.77932: variable 'ansible_search_path' from source: unknown 10215 1727204052.77933: variable 'ansible_search_path' from source: unknown 10215 1727204052.77964: we have included files to process 10215 1727204052.77965: generating all_blocks data 10215 1727204052.77966: done generating all_blocks data 10215 1727204052.77968: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204052.77968: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204052.77970: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204052.78832: done processing included file 10215 1727204052.78834: iterating over new_blocks loaded from include file 10215 1727204052.78835: in VariableManager get_vars() 10215 1727204052.78850: done with get_vars() 10215 1727204052.78852: filtering new block on tags 10215 1727204052.78871: done filtering new block on tags 10215 1727204052.78873: in VariableManager get_vars() 10215 1727204052.78887: done with get_vars() 10215 1727204052.78890: filtering new block on tags 10215 1727204052.78911: done filtering new block on tags 10215 1727204052.78913: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 10215 1727204052.78917: extending task lists for all hosts with included blocks 10215 1727204052.79054: done extending task lists 10215 1727204052.79055: done processing included files 10215 1727204052.79055: results queue empty 10215 1727204052.79056: checking for any_errors_fatal 10215 1727204052.79058: done checking for any_errors_fatal 10215 1727204052.79059: checking for max_fail_percentage 10215 1727204052.79060: done checking for max_fail_percentage 10215 1727204052.79060: checking to see if all hosts have failed and the running result is not ok 10215 1727204052.79061: done checking to see if all hosts have failed 10215 1727204052.79061: getting the remaining hosts for this loop 10215 1727204052.79062: done getting the remaining hosts for this loop 10215 1727204052.79064: getting the next task for host managed-node3 10215 1727204052.79067: done getting next task for host managed-node3 10215 1727204052.79069: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204052.79071: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204052.79072: getting variables 10215 1727204052.79073: in VariableManager get_vars() 10215 1727204052.79137: Calling all_inventory to load vars for managed-node3 10215 1727204052.79141: Calling groups_inventory to load vars for managed-node3 10215 1727204052.79142: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.79147: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.79149: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.79151: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.80222: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.81769: done with get_vars() 10215 1727204052.81792: done getting variables 10215 1727204052.81831: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.081) 0:00:21.385 ***** 10215 1727204052.81854: entering _queue_task() for managed-node3/set_fact 10215 1727204052.82129: worker is 1 (out of 1 available) 10215 1727204052.82143: exiting _queue_task() for managed-node3/set_fact 10215 1727204052.82156: done queuing things up, now waiting for results queue to drain 10215 1727204052.82158: waiting for pending results... 
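The task being queued here, 'Initialize NM profile exist and ansible_managed comment flag' (get_profile_stat.yml:3), sets the three lsr_net_profile_* flags reported in its result further down. Judging from the ansible_facts it returns, the task is a plain set_fact of roughly this shape; this is a sketch reconstructed from the logged output, not the verbatim file contents:

- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

Starting every flag at false means that any later check which is skipped leaves the conservative default in place for the flag.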
10215 1727204052.82346: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204052.82435: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b0 10215 1727204052.82448: variable 'ansible_search_path' from source: unknown 10215 1727204052.82452: variable 'ansible_search_path' from source: unknown 10215 1727204052.82484: calling self._execute() 10215 1727204052.82562: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.82569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.82579: variable 'omit' from source: magic vars 10215 1727204052.82900: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.82912: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.82920: variable 'omit' from source: magic vars 10215 1727204052.82960: variable 'omit' from source: magic vars 10215 1727204052.82991: variable 'omit' from source: magic vars 10215 1727204052.83027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204052.83061: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204052.83079: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204052.83098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.83111: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.83137: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204052.83141: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.83149: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.83230: Set connection var ansible_connection to ssh 10215 1727204052.83240: Set connection var ansible_pipelining to False 10215 1727204052.83247: Set connection var ansible_shell_type to sh 10215 1727204052.83254: Set connection var ansible_timeout to 10 10215 1727204052.83261: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204052.83271: Set connection var ansible_shell_executable to /bin/sh 10215 1727204052.83291: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.83295: variable 'ansible_connection' from source: unknown 10215 1727204052.83298: variable 'ansible_module_compression' from source: unknown 10215 1727204052.83301: variable 'ansible_shell_type' from source: unknown 10215 1727204052.83305: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.83311: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.83314: variable 'ansible_pipelining' from source: unknown 10215 1727204052.83319: variable 'ansible_timeout' from source: unknown 10215 1727204052.83324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.83443: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204052.83454: variable 
'omit' from source: magic vars 10215 1727204052.83459: starting attempt loop 10215 1727204052.83462: running the handler 10215 1727204052.83478: handler run complete 10215 1727204052.83491: attempt loop complete, returning result 10215 1727204052.83497: _execute() done 10215 1727204052.83500: dumping result to json 10215 1727204052.83502: done dumping result, returning 10215 1727204052.83505: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-3c74-8f8e-0000000003b0] 10215 1727204052.83513: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b0 10215 1727204052.83596: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b0 10215 1727204052.83599: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10215 1727204052.83660: no more pending results, returning what we have 10215 1727204052.83664: results queue empty 10215 1727204052.83665: checking for any_errors_fatal 10215 1727204052.83667: done checking for any_errors_fatal 10215 1727204052.83668: checking for max_fail_percentage 10215 1727204052.83669: done checking for max_fail_percentage 10215 1727204052.83670: checking to see if all hosts have failed and the running result is not ok 10215 1727204052.83671: done checking to see if all hosts have failed 10215 1727204052.83673: getting the remaining hosts for this loop 10215 1727204052.83674: done getting the remaining hosts for this loop 10215 1727204052.83679: getting the next task for host managed-node3 10215 1727204052.83686: done getting next task for host managed-node3 10215 1727204052.83688: ^ task is: TASK: Stat profile file 10215 1727204052.83694: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204052.83697: getting variables 10215 1727204052.83699: in VariableManager get_vars() 10215 1727204052.83740: Calling all_inventory to load vars for managed-node3 10215 1727204052.83744: Calling groups_inventory to load vars for managed-node3 10215 1727204052.83746: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204052.83757: Calling all_plugins_play to load vars for managed-node3 10215 1727204052.83760: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204052.83763: Calling groups_plugins_play to load vars for managed-node3 10215 1727204052.85050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204052.86614: done with get_vars() 10215 1727204052.86635: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.048) 0:00:21.434 ***** 10215 1727204052.86707: entering _queue_task() for managed-node3/stat 10215 1727204052.86929: worker is 1 (out of 1 available) 10215 1727204052.86944: exiting _queue_task() for managed-node3/stat 10215 1727204052.86956: done queuing things up, now waiting for results queue to drain 10215 1727204052.86958: waiting for pending results... 10215 1727204052.87146: running TaskExecutor() for managed-node3/TASK: Stat profile file 10215 1727204052.87230: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b1 10215 1727204052.87242: variable 'ansible_search_path' from source: unknown 10215 1727204052.87246: variable 'ansible_search_path' from source: unknown 10215 1727204052.87278: calling self._execute() 10215 1727204052.87356: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.87363: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.87373: variable 'omit' from source: magic vars 10215 1727204052.87697: variable 'ansible_distribution_major_version' from source: facts 10215 1727204052.87708: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204052.87717: variable 'omit' from source: magic vars 10215 1727204052.87759: variable 'omit' from source: magic vars 10215 1727204052.87842: variable 'profile' from source: include params 10215 1727204052.87847: variable 'item' from source: include params 10215 1727204052.87911: variable 'item' from source: include params 10215 1727204052.87927: variable 'omit' from source: magic vars 10215 1727204052.87966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204052.87999: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204052.88019: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204052.88035: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.88046: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204052.88076: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204052.88080: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.88085: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.88168: Set connection var ansible_connection to ssh 10215 1727204052.88171: Set connection var ansible_pipelining to False 10215 1727204052.88182: Set connection var ansible_shell_type to sh 10215 1727204052.88185: Set connection var ansible_timeout to 10 10215 1727204052.88195: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204052.88204: Set connection var ansible_shell_executable to /bin/sh 10215 1727204052.88225: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.88228: variable 'ansible_connection' from source: unknown 10215 1727204052.88234: variable 'ansible_module_compression' from source: unknown 10215 1727204052.88237: variable 'ansible_shell_type' from source: unknown 10215 1727204052.88239: variable 'ansible_shell_executable' from source: unknown 10215 1727204052.88241: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204052.88247: variable 'ansible_pipelining' from source: unknown 10215 1727204052.88250: variable 'ansible_timeout' from source: unknown 10215 1727204052.88256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204052.88433: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204052.88442: variable 'omit' from source: magic vars 10215 1727204052.88449: starting attempt loop 10215 1727204052.88451: running the handler 10215 1727204052.88465: _low_level_execute_command(): starting 10215 1727204052.88472: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204052.88991: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.89028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204052.89031: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.89034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204052.89038: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.89098: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.89101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.89146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.90896: stdout chunk (state=3): >>>/root <<< 10215 1727204052.90999: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.91056: 
stderr chunk (state=3): >>><<< 10215 1727204052.91059: stdout chunk (state=3): >>><<< 10215 1727204052.91080: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.91093: _low_level_execute_command(): starting 10215 1727204052.91100: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944 `" && echo ansible-tmp-1727204052.910804-11513-60556538889944="` echo /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944 `" ) && sleep 0' 10215 1727204052.91566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204052.91570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.91582: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.91585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.91635: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.91638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.91681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.93639: stdout chunk (state=3): >>>ansible-tmp-1727204052.910804-11513-60556538889944=/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944 <<< 10215 1727204052.93756: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.93818: stderr chunk (state=3): >>><<< 10215 
1727204052.93822: stdout chunk (state=3): >>><<< 10215 1727204052.93840: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.910804-11513-60556538889944=/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204052.93884: variable 'ansible_module_compression' from source: unknown 10215 1727204052.93936: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204052.93971: variable 'ansible_facts' from source: unknown 10215 1727204052.94032: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py 10215 1727204052.94149: Sending initial data 10215 1727204052.94152: Sent initial data (151 bytes) 10215 1727204052.94583: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.94619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204052.94623: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.94625: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.94628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.94680: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.94684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.94726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204052.96331: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server 
supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204052.96371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204052.96428: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpopzw9muc /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py <<< 10215 1727204052.96431: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py" <<< 10215 1727204052.96466: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpopzw9muc" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py" <<< 10215 1727204052.97587: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204052.97637: stderr chunk (state=3): >>><<< 10215 1727204052.97649: stdout chunk (state=3): >>><<< 10215 1727204052.97685: done transferring module to remote 10215 1727204052.97713: _low_level_execute_command(): starting 10215 1727204052.97724: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/ /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py && sleep 0' 10215 1727204052.98425: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204052.98471: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204052.98474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204052.98550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.98569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204052.98626: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204052.98644: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 10215 1727204052.98668: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204052.98741: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.00750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.00754: stdout chunk (state=3): >>><<< 10215 1727204053.00757: stderr chunk (state=3): >>><<< 10215 1727204053.00777: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.00787: _low_level_execute_command(): starting 10215 1727204053.00801: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/AnsiballZ_stat.py && sleep 0' 10215 1727204053.01467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204053.01484: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204053.01506: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204053.01530: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204053.01551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204053.01568: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204053.01584: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.01698: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.01726: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.01810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 10215 1727204053.19313: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 1727204053.20791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204053.20851: stderr chunk (state=3): >>><<< 10215 1727204053.20855: stdout chunk (state=3): >>><<< 10215 1727204053.20871: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
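The module_args echoed in the stat result above pin down what the 'Stat profile file' task (get_profile_stat.yml:9) sends to the remote host: it stats /etc/sysconfig/network-scripts/ifcfg-bond0 with attribute, checksum and MIME collection disabled. A task of roughly this shape would produce that invocation; the {{ profile }} templating and the profile_stat register name are assumptions inferred from the 'profile'/'item' include params and the profile_stat.stat.exists conditional seen elsewhere in this log:

- name: Stat profile file
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"  # renders to ifcfg-bond0 for item=bond0
    get_attributes: false
    get_checksum: false
    get_mime: false
  register: profile_stat

With stat.exists coming back false, the ifcfg file is absent on managed-node3, so the flag-setting task that follows is skipped and the run proceeds to 'Get NM profile info'.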
10215 1727204053.20902: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204053.20920: _low_level_execute_command(): starting 10215 1727204053.20925: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.910804-11513-60556538889944/ > /dev/null 2>&1 && sleep 0' 10215 1727204053.21383: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204053.21395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204053.21421: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.21424: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.21492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204053.21501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.21503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.21529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.23468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.23519: stderr chunk (state=3): >>><<< 10215 1727204053.23524: stdout chunk (state=3): >>><<< 10215 1727204053.23537: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.23546: handler run complete 10215 1727204053.23565: attempt loop complete, returning result 10215 1727204053.23568: _execute() done 10215 1727204053.23571: dumping result to json 10215 1727204053.23576: done dumping result, returning 10215 1727204053.23585: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-3c74-8f8e-0000000003b1] 10215 1727204053.23592: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b1 10215 1727204053.23695: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b1 10215 1727204053.23698: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10215 1727204053.23768: no more pending results, returning what we have 10215 1727204053.23772: results queue empty 10215 1727204053.23773: checking for any_errors_fatal 10215 1727204053.23780: done checking for any_errors_fatal 10215 1727204053.23780: checking for max_fail_percentage 10215 1727204053.23782: done checking for max_fail_percentage 10215 1727204053.23783: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.23784: done checking to see if all hosts have failed 10215 1727204053.23785: getting the remaining hosts for this loop 10215 1727204053.23787: done getting the remaining hosts for this loop 10215 1727204053.23794: getting the next task for host managed-node3 10215 1727204053.23802: done getting next task for host managed-node3 10215 1727204053.23804: ^ task is: TASK: Set NM profile exist flag based on the profile files 10215 1727204053.23810: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.23814: getting variables 10215 1727204053.23816: in VariableManager get_vars() 10215 1727204053.23859: Calling all_inventory to load vars for managed-node3 10215 1727204053.23862: Calling groups_inventory to load vars for managed-node3 10215 1727204053.23864: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.23876: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.23879: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.23883: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.25118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.26678: done with get_vars() 10215 1727204053.26702: done getting variables 10215 1727204053.26757: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.400) 0:00:21.835 ***** 10215 1727204053.26782: entering _queue_task() for managed-node3/set_fact 10215 1727204053.27030: worker is 1 (out of 1 available) 10215 1727204053.27045: exiting _queue_task() for managed-node3/set_fact 10215 1727204053.27060: done queuing things up, now waiting for results queue to drain 10215 1727204053.27062: waiting for pending results... 
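The task being queued next, 'Set NM profile exist flag based on the profile files' (get_profile_stat.yml:17), is guarded by the stat result: its skip output below reports false_condition: profile_stat.stat.exists. A minimal sketch of such a task follows, with the actual fact assignment assumed (it is never printed here because the task is skipped):

- name: Set NM profile exist flag based on the profile files
  set_fact:
    lsr_net_profile_exists: true
  when: profile_stat.stat.exists

Because ifcfg-bond0 does not exist, the flag stays at the false default set earlier and the play moves on to the shell-based 'Get NM profile info' check.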
10215 1727204053.27245: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10215 1727204053.27334: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b2 10215 1727204053.27346: variable 'ansible_search_path' from source: unknown 10215 1727204053.27350: variable 'ansible_search_path' from source: unknown 10215 1727204053.27383: calling self._execute() 10215 1727204053.27458: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.27466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.27476: variable 'omit' from source: magic vars 10215 1727204053.27799: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.27813: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.27916: variable 'profile_stat' from source: set_fact 10215 1727204053.27928: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204053.27932: when evaluation is False, skipping this task 10215 1727204053.27935: _execute() done 10215 1727204053.27940: dumping result to json 10215 1727204053.27943: done dumping result, returning 10215 1727204053.27955: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-3c74-8f8e-0000000003b2] 10215 1727204053.27958: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b2 10215 1727204053.28048: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b2 10215 1727204053.28051: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204053.28113: no more pending results, returning what we have 10215 1727204053.28117: results queue empty 10215 1727204053.28118: checking for any_errors_fatal 10215 1727204053.28125: done checking for any_errors_fatal 10215 1727204053.28126: checking for max_fail_percentage 10215 1727204053.28128: done checking for max_fail_percentage 10215 1727204053.28129: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.28130: done checking to see if all hosts have failed 10215 1727204053.28131: getting the remaining hosts for this loop 10215 1727204053.28133: done getting the remaining hosts for this loop 10215 1727204053.28136: getting the next task for host managed-node3 10215 1727204053.28142: done getting next task for host managed-node3 10215 1727204053.28145: ^ task is: TASK: Get NM profile info 10215 1727204053.28149: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.28153: getting variables 10215 1727204053.28154: in VariableManager get_vars() 10215 1727204053.28193: Calling all_inventory to load vars for managed-node3 10215 1727204053.28197: Calling groups_inventory to load vars for managed-node3 10215 1727204053.28200: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.28212: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.28216: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.28219: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.29477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.31011: done with get_vars() 10215 1727204053.31033: done getting variables 10215 1727204053.31083: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.043) 0:00:21.878 ***** 10215 1727204053.31111: entering _queue_task() for managed-node3/shell 10215 1727204053.31340: worker is 1 (out of 1 available) 10215 1727204053.31355: exiting _queue_task() for managed-node3/shell 10215 1727204053.31366: done queuing things up, now waiting for results queue to drain 10215 1727204053.31368: waiting for pending results... 10215 1727204053.31560: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10215 1727204053.31648: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b3 10215 1727204053.31660: variable 'ansible_search_path' from source: unknown 10215 1727204053.31664: variable 'ansible_search_path' from source: unknown 10215 1727204053.31697: calling self._execute() 10215 1727204053.31775: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.31781: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.31793: variable 'omit' from source: magic vars 10215 1727204053.32114: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.32125: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.32132: variable 'omit' from source: magic vars 10215 1727204053.32174: variable 'omit' from source: magic vars 10215 1727204053.32262: variable 'profile' from source: include params 10215 1727204053.32266: variable 'item' from source: include params 10215 1727204053.32321: variable 'item' from source: include params 10215 1727204053.32338: variable 'omit' from source: magic vars 10215 1727204053.32378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204053.32415: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204053.32432: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204053.32448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204053.32459: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204053.32492: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204053.32496: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.32499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.32580: Set connection var ansible_connection to ssh 10215 1727204053.32588: Set connection var ansible_pipelining to False 10215 1727204053.32596: Set connection var ansible_shell_type to sh 10215 1727204053.32603: Set connection var ansible_timeout to 10 10215 1727204053.32612: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204053.32621: Set connection var ansible_shell_executable to /bin/sh 10215 1727204053.32640: variable 'ansible_shell_executable' from source: unknown 10215 1727204053.32643: variable 'ansible_connection' from source: unknown 10215 1727204053.32645: variable 'ansible_module_compression' from source: unknown 10215 1727204053.32650: variable 'ansible_shell_type' from source: unknown 10215 1727204053.32652: variable 'ansible_shell_executable' from source: unknown 10215 1727204053.32657: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.32662: variable 'ansible_pipelining' from source: unknown 10215 1727204053.32665: variable 'ansible_timeout' from source: unknown 10215 1727204053.32670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.32815: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204053.32825: variable 'omit' from source: magic vars 10215 1727204053.32830: starting attempt loop 10215 1727204053.32833: running the handler 10215 1727204053.32844: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204053.32861: _low_level_execute_command(): starting 10215 1727204053.32869: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204053.33393: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204053.33426: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.33430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204053.33432: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 
10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.33486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204053.33490: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.33548: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.35273: stdout chunk (state=3): >>>/root <<< 10215 1727204053.35397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.35443: stderr chunk (state=3): >>><<< 10215 1727204053.35449: stdout chunk (state=3): >>><<< 10215 1727204053.35474: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.35485: _low_level_execute_command(): starting 10215 1727204053.35493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280 `" && echo ansible-tmp-1727204053.3547306-11530-268931654351280="` echo /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280 `" ) && sleep 0' 10215 1727204053.35944: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204053.35954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.35958: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204053.35961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.35995: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.36020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.36049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.38011: stdout chunk (state=3): >>>ansible-tmp-1727204053.3547306-11530-268931654351280=/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280 <<< 10215 1727204053.38146: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.38245: stderr chunk (state=3): >>><<< 10215 1727204053.38249: stdout chunk (state=3): >>><<< 10215 1727204053.38252: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204053.3547306-11530-268931654351280=/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.38284: variable 'ansible_module_compression' from source: unknown 10215 1727204053.38344: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204053.38400: variable 'ansible_facts' from source: unknown 10215 1727204053.38573: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py 10215 1727204053.38708: Sending initial data 10215 1727204053.38711: Sent initial data (156 bytes) 10215 1727204053.39347: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.39418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204053.39454: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.39496: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.39527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.41144: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204053.41215: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204053.41260: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp7yxhlroz /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py <<< 10215 1727204053.41263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py" <<< 10215 1727204053.41300: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp7yxhlroz" to remote "/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py" <<< 10215 1727204053.42298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.42391: stderr chunk (state=3): >>><<< 10215 1727204053.42404: stdout chunk (state=3): >>><<< 10215 1727204053.42440: done transferring module to remote 10215 1727204053.42482: _low_level_execute_command(): starting 10215 1727204053.42485: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/ /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py && sleep 0' 10215 1727204053.43260: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.43287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204053.43306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.43337: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.43404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.45342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.45345: stdout chunk (state=3): >>><<< 10215 1727204053.45348: stderr chunk (state=3): >>><<< 10215 1727204053.45366: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.45379: _low_level_execute_command(): starting 10215 1727204053.45392: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/AnsiballZ_command.py && sleep 0' 10215 1727204053.46005: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204053.46023: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204053.46048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204053.46069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204053.46088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204053.46117: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204053.46161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204053.46239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204053.46274: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.46300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.46397: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.68226: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:13.657804", "end": "2024-09-24 14:54:13.680883", "delta": "0:00:00.023079", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204053.69981: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204053.70014: stdout chunk (state=3): >>><<< 10215 1727204053.70017: stderr chunk (state=3): >>><<< 10215 1727204053.70039: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:13.657804", "end": "2024-09-24 14:54:13.680883", "delta": "0:00:00.023079", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
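[Editor's sketch] The module invocation dumped above corresponds to the shell task the log names at get_profile_stat.yml:25 ("Get NM profile info"). A minimal reconstruction from the cmd string and the register name that appears later in this log (nm_profile_exists); the exact wording of the real tasks file may differ, and ignore_errors is an assumption:

- name: Get NM profile info
  # In this run {{ profile }} comes from include params and expands to "bond0"
  shell: nmcli -f NAME,FILENAME connection show | grep {{ profile }} | grep /etc
  register: nm_profile_exists     # its rc is checked by the next task in this log
  ignore_errors: true             # assumption: a non-matching grep should not fail the play
  when: ansible_distribution_major_version != '6'   # evaluated True in this run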
10215 1727204053.70164: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204053.70167: _low_level_execute_command(): starting 10215 1727204053.70170: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204053.3547306-11530-268931654351280/ > /dev/null 2>&1 && sleep 0' 10215 1727204053.70879: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204053.71002: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204053.71054: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204053.71122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204053.73718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204053.73779: stderr chunk (state=3): >>><<< 10215 1727204053.73797: stdout chunk (state=3): >>><<< 10215 1727204053.73995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204053.73998: handler run complete 10215 1727204053.74001: Evaluated conditional (False): False 10215 1727204053.74003: attempt loop complete, returning result 10215 1727204053.74005: _execute() done 10215 1727204053.74010: dumping result to json 10215 1727204053.74012: done dumping result, returning 10215 1727204053.74014: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-3c74-8f8e-0000000003b3] 10215 1727204053.74016: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b3 10215 1727204053.74093: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b3 10215 1727204053.74096: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.023079", "end": "2024-09-24 14:54:13.680883", "rc": 0, "start": "2024-09-24 14:54:13.657804" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10215 1727204053.74197: no more pending results, returning what we have 10215 1727204053.74201: results queue empty 10215 1727204053.74203: checking for any_errors_fatal 10215 1727204053.74214: done checking for any_errors_fatal 10215 1727204053.74215: checking for max_fail_percentage 10215 1727204053.74217: done checking for max_fail_percentage 10215 1727204053.74218: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.74219: done checking to see if all hosts have failed 10215 1727204053.74220: getting the remaining hosts for this loop 10215 1727204053.74222: done getting the remaining hosts for this loop 10215 1727204053.74227: getting the next task for host managed-node3 10215 1727204053.74235: done getting next task for host managed-node3 10215 1727204053.74238: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204053.74243: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.74246: getting variables 10215 1727204053.74248: in VariableManager get_vars() 10215 1727204053.74413: Calling all_inventory to load vars for managed-node3 10215 1727204053.74417: Calling groups_inventory to load vars for managed-node3 10215 1727204053.74421: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.74435: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.74439: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.74443: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.77067: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.79792: done with get_vars() 10215 1727204053.79821: done getting variables 10215 1727204053.79873: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.487) 0:00:22.366 ***** 10215 1727204053.79904: entering _queue_task() for managed-node3/set_fact 10215 1727204053.80186: worker is 1 (out of 1 available) 10215 1727204053.80203: exiting _queue_task() for managed-node3/set_fact 10215 1727204053.80218: done queuing things up, now waiting for results queue to drain 10215 1727204053.80220: waiting for pending results... 
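[Editor's sketch] Each TaskExecutor run in this log (see the "Set connection var ..." lines above and immediately below) resolves the same set of standard Ansible connection settings. For reference, a hedged host_vars sketch that would pin the identical values explicitly; the variable names are standard Ansible ones, the values simply mirror what this run logs:

# host_vars/managed-node3.yml (illustrative only; this run resolves the same values from defaults)
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
ansible_module_compression: ZIP_DEFLATED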
10215 1727204053.80415: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204053.80504: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b4 10215 1727204053.80517: variable 'ansible_search_path' from source: unknown 10215 1727204053.80521: variable 'ansible_search_path' from source: unknown 10215 1727204053.80554: calling self._execute() 10215 1727204053.80637: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.80644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.80654: variable 'omit' from source: magic vars 10215 1727204053.80968: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.80978: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.81093: variable 'nm_profile_exists' from source: set_fact 10215 1727204053.81114: Evaluated conditional (nm_profile_exists.rc == 0): True 10215 1727204053.81118: variable 'omit' from source: magic vars 10215 1727204053.81174: variable 'omit' from source: magic vars 10215 1727204053.81203: variable 'omit' from source: magic vars 10215 1727204053.81345: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204053.81349: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204053.81352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204053.81354: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204053.81357: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204053.81419: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204053.81423: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.81426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.81583: Set connection var ansible_connection to ssh 10215 1727204053.81587: Set connection var ansible_pipelining to False 10215 1727204053.81592: Set connection var ansible_shell_type to sh 10215 1727204053.81594: Set connection var ansible_timeout to 10 10215 1727204053.81597: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204053.81600: Set connection var ansible_shell_executable to /bin/sh 10215 1727204053.81603: variable 'ansible_shell_executable' from source: unknown 10215 1727204053.81605: variable 'ansible_connection' from source: unknown 10215 1727204053.81610: variable 'ansible_module_compression' from source: unknown 10215 1727204053.81612: variable 'ansible_shell_type' from source: unknown 10215 1727204053.81614: variable 'ansible_shell_executable' from source: unknown 10215 1727204053.81616: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.81618: variable 'ansible_pipelining' from source: unknown 10215 1727204053.81620: variable 'ansible_timeout' from source: unknown 10215 1727204053.81623: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.81782: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204053.81804: variable 'omit' from source: magic vars 10215 1727204053.81810: starting attempt loop 10215 1727204053.81813: running the handler 10215 1727204053.81821: handler run complete 10215 1727204053.81834: attempt loop complete, returning result 10215 1727204053.81837: _execute() done 10215 1727204053.81839: dumping result to json 10215 1727204053.81845: done dumping result, returning 10215 1727204053.81855: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-3c74-8f8e-0000000003b4] 10215 1727204053.81900: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b4 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10215 1727204053.82098: no more pending results, returning what we have 10215 1727204053.82101: results queue empty 10215 1727204053.82102: checking for any_errors_fatal 10215 1727204053.82111: done checking for any_errors_fatal 10215 1727204053.82112: checking for max_fail_percentage 10215 1727204053.82114: done checking for max_fail_percentage 10215 1727204053.82115: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.82115: done checking to see if all hosts have failed 10215 1727204053.82116: getting the remaining hosts for this loop 10215 1727204053.82118: done getting the remaining hosts for this loop 10215 1727204053.82121: getting the next task for host managed-node3 10215 1727204053.82130: done getting next task for host managed-node3 10215 1727204053.82133: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204053.82137: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.82143: getting variables 10215 1727204053.82144: in VariableManager get_vars() 10215 1727204053.82180: Calling all_inventory to load vars for managed-node3 10215 1727204053.82183: Calling groups_inventory to load vars for managed-node3 10215 1727204053.82186: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.82198: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.82202: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.82205: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.82729: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b4 10215 1727204053.82732: WORKER PROCESS EXITING 10215 1727204053.83669: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.85613: done with get_vars() 10215 1727204053.85653: done getting variables 10215 1727204053.85733: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204053.85874: variable 'profile' from source: include params 10215 1727204053.85879: variable 'item' from source: include params 10215 1727204053.85957: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.060) 0:00:22.427 ***** 10215 1727204053.86000: entering _queue_task() for managed-node3/command 10215 1727204053.86366: worker is 1 (out of 1 available) 10215 1727204053.86383: exiting _queue_task() for managed-node3/command 10215 1727204053.86398: done queuing things up, now waiting for results queue to drain 10215 1727204053.86400: waiting for pending results... 
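[Editor's sketch] The set_fact result shown above (task at get_profile_stat.yml:35) flips the lsr_net_profile_* flags once the nmcli lookup succeeded. A minimal sketch consistent with the conditional and the facts this log reports; the real tasks file may phrase it differently:

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0   # True here because the nmcli|grep pipeline matched bond0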
10215 1727204053.86619: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 10215 1727204053.86713: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b6 10215 1727204053.86728: variable 'ansible_search_path' from source: unknown 10215 1727204053.86734: variable 'ansible_search_path' from source: unknown 10215 1727204053.86767: calling self._execute() 10215 1727204053.86848: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.86854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.86869: variable 'omit' from source: magic vars 10215 1727204053.87173: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.87184: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.87287: variable 'profile_stat' from source: set_fact 10215 1727204053.87302: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204053.87307: when evaluation is False, skipping this task 10215 1727204053.87310: _execute() done 10215 1727204053.87313: dumping result to json 10215 1727204053.87319: done dumping result, returning 10215 1727204053.87326: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-3c74-8f8e-0000000003b6] 10215 1727204053.87333: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b6 10215 1727204053.87429: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b6 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204053.87486: no more pending results, returning what we have 10215 1727204053.87493: results queue empty 10215 1727204053.87495: checking for any_errors_fatal 10215 1727204053.87503: done checking for any_errors_fatal 10215 1727204053.87504: checking for max_fail_percentage 10215 1727204053.87506: done checking for max_fail_percentage 10215 1727204053.87507: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.87508: done checking to see if all hosts have failed 10215 1727204053.87510: getting the remaining hosts for this loop 10215 1727204053.87512: done getting the remaining hosts for this loop 10215 1727204053.87517: getting the next task for host managed-node3 10215 1727204053.87523: done getting next task for host managed-node3 10215 1727204053.87526: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204053.87531: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.87536: getting variables 10215 1727204053.87538: in VariableManager get_vars() 10215 1727204053.87575: Calling all_inventory to load vars for managed-node3 10215 1727204053.87578: Calling groups_inventory to load vars for managed-node3 10215 1727204053.87580: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.87602: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.87606: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.87612: WORKER PROCESS EXITING 10215 1727204053.87616: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.89536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.91513: done with get_vars() 10215 1727204053.91541: done getting variables 10215 1727204053.91591: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204053.91683: variable 'profile' from source: include params 10215 1727204053.91686: variable 'item' from source: include params 10215 1727204053.91736: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.057) 0:00:22.484 ***** 10215 1727204053.91764: entering _queue_task() for managed-node3/set_fact 10215 1727204053.92023: worker is 1 (out of 1 available) 10215 1727204053.92037: exiting _queue_task() for managed-node3/set_fact 10215 1727204053.92050: done queuing things up, now waiting for results queue to drain 10215 1727204053.92052: waiting for pending results... 
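[Editor's sketch] The ifcfg-oriented checks here, and the fingerprint checks further down, are all guarded by profile_stat.stat.exists and are skipped in this run: the nmcli output earlier shows the bond0 profiles are NetworkManager keyfiles under /etc/NetworkManager/system-connections, so there is no ifcfg-bond0 initscripts file to inspect. A sketch of the guard pattern; only the condition itself is taken from the log, the command, path, and register name are assumptions:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # path and pattern are assumptions
  register: ansible_managed_comment   # register name is an assumption
  when: profile_stat.stat.exists      # False in this run, so the task is skipped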
10215 1727204053.92248: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 10215 1727204053.92343: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b7 10215 1727204053.92355: variable 'ansible_search_path' from source: unknown 10215 1727204053.92360: variable 'ansible_search_path' from source: unknown 10215 1727204053.92429: calling self._execute() 10215 1727204053.92521: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.92525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.92530: variable 'omit' from source: magic vars 10215 1727204053.93027: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.93032: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.93163: variable 'profile_stat' from source: set_fact 10215 1727204053.93179: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204053.93182: when evaluation is False, skipping this task 10215 1727204053.93185: _execute() done 10215 1727204053.93245: dumping result to json 10215 1727204053.93249: done dumping result, returning 10215 1727204053.93252: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-3c74-8f8e-0000000003b7] 10215 1727204053.93256: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b7 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204053.93483: no more pending results, returning what we have 10215 1727204053.93486: results queue empty 10215 1727204053.93487: checking for any_errors_fatal 10215 1727204053.93494: done checking for any_errors_fatal 10215 1727204053.93495: checking for max_fail_percentage 10215 1727204053.93497: done checking for max_fail_percentage 10215 1727204053.93498: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.93499: done checking to see if all hosts have failed 10215 1727204053.93500: getting the remaining hosts for this loop 10215 1727204053.93502: done getting the remaining hosts for this loop 10215 1727204053.93506: getting the next task for host managed-node3 10215 1727204053.93512: done getting next task for host managed-node3 10215 1727204053.93515: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10215 1727204053.93519: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.93523: getting variables 10215 1727204053.93525: in VariableManager get_vars() 10215 1727204053.93563: Calling all_inventory to load vars for managed-node3 10215 1727204053.93566: Calling groups_inventory to load vars for managed-node3 10215 1727204053.93569: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.93580: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.93583: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.93587: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.93610: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b7 10215 1727204053.93613: WORKER PROCESS EXITING 10215 1727204053.94968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204053.96516: done with get_vars() 10215 1727204053.96540: done getting variables 10215 1727204053.96588: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204053.96679: variable 'profile' from source: include params 10215 1727204053.96682: variable 'item' from source: include params 10215 1727204053.96730: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.049) 0:00:22.534 ***** 10215 1727204053.96759: entering _queue_task() for managed-node3/command 10215 1727204053.97010: worker is 1 (out of 1 available) 10215 1727204053.97025: exiting _queue_task() for managed-node3/command 10215 1727204053.97039: done queuing things up, now waiting for results queue to drain 10215 1727204053.97041: waiting for pending results... 
10215 1727204053.97239: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 10215 1727204053.97337: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b8 10215 1727204053.97348: variable 'ansible_search_path' from source: unknown 10215 1727204053.97352: variable 'ansible_search_path' from source: unknown 10215 1727204053.97386: calling self._execute() 10215 1727204053.97464: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204053.97472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204053.97484: variable 'omit' from source: magic vars 10215 1727204053.97784: variable 'ansible_distribution_major_version' from source: facts 10215 1727204053.97798: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204053.97900: variable 'profile_stat' from source: set_fact 10215 1727204053.97915: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204053.97920: when evaluation is False, skipping this task 10215 1727204053.97923: _execute() done 10215 1727204053.97926: dumping result to json 10215 1727204053.97928: done dumping result, returning 10215 1727204053.97937: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0 [12b410aa-8751-3c74-8f8e-0000000003b8] 10215 1727204053.97943: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b8 10215 1727204053.98031: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b8 10215 1727204053.98035: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204053.98105: no more pending results, returning what we have 10215 1727204053.98109: results queue empty 10215 1727204053.98111: checking for any_errors_fatal 10215 1727204053.98116: done checking for any_errors_fatal 10215 1727204053.98117: checking for max_fail_percentage 10215 1727204053.98119: done checking for max_fail_percentage 10215 1727204053.98120: checking to see if all hosts have failed and the running result is not ok 10215 1727204053.98121: done checking to see if all hosts have failed 10215 1727204053.98122: getting the remaining hosts for this loop 10215 1727204053.98123: done getting the remaining hosts for this loop 10215 1727204053.98127: getting the next task for host managed-node3 10215 1727204053.98135: done getting next task for host managed-node3 10215 1727204053.98137: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10215 1727204053.98141: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204053.98145: getting variables 10215 1727204053.98148: in VariableManager get_vars() 10215 1727204053.98183: Calling all_inventory to load vars for managed-node3 10215 1727204053.98186: Calling groups_inventory to load vars for managed-node3 10215 1727204053.98191: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204053.98202: Calling all_plugins_play to load vars for managed-node3 10215 1727204053.98205: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204053.98208: Calling groups_plugins_play to load vars for managed-node3 10215 1727204053.99366: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.00902: done with get_vars() 10215 1727204054.00924: done getting variables 10215 1727204054.00974: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204054.01057: variable 'profile' from source: include params 10215 1727204054.01060: variable 'item' from source: include params 10215 1727204054.01110: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.043) 0:00:22.578 ***** 10215 1727204054.01134: entering _queue_task() for managed-node3/set_fact 10215 1727204054.01371: worker is 1 (out of 1 available) 10215 1727204054.01387: exiting _queue_task() for managed-node3/set_fact 10215 1727204054.01401: done queuing things up, now waiting for results queue to drain 10215 1727204054.01403: waiting for pending results... 
10215 1727204054.01582: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0 10215 1727204054.01673: in run() - task 12b410aa-8751-3c74-8f8e-0000000003b9 10215 1727204054.01685: variable 'ansible_search_path' from source: unknown 10215 1727204054.01691: variable 'ansible_search_path' from source: unknown 10215 1727204054.01726: calling self._execute() 10215 1727204054.01803: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.01809: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.01822: variable 'omit' from source: magic vars 10215 1727204054.02123: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.02133: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.02239: variable 'profile_stat' from source: set_fact 10215 1727204054.02251: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204054.02255: when evaluation is False, skipping this task 10215 1727204054.02258: _execute() done 10215 1727204054.02261: dumping result to json 10215 1727204054.02266: done dumping result, returning 10215 1727204054.02274: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0 [12b410aa-8751-3c74-8f8e-0000000003b9] 10215 1727204054.02285: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b9 10215 1727204054.02371: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003b9 10215 1727204054.02374: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204054.02443: no more pending results, returning what we have 10215 1727204054.02446: results queue empty 10215 1727204054.02448: checking for any_errors_fatal 10215 1727204054.02452: done checking for any_errors_fatal 10215 1727204054.02453: checking for max_fail_percentage 10215 1727204054.02454: done checking for max_fail_percentage 10215 1727204054.02455: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.02456: done checking to see if all hosts have failed 10215 1727204054.02457: getting the remaining hosts for this loop 10215 1727204054.02459: done getting the remaining hosts for this loop 10215 1727204054.02462: getting the next task for host managed-node3 10215 1727204054.02469: done getting next task for host managed-node3 10215 1727204054.02472: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10215 1727204054.02475: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.02480: getting variables 10215 1727204054.02481: in VariableManager get_vars() 10215 1727204054.02526: Calling all_inventory to load vars for managed-node3 10215 1727204054.02529: Calling groups_inventory to load vars for managed-node3 10215 1727204054.02532: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.02542: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.02545: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.02548: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.03812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.05340: done with get_vars() 10215 1727204054.05361: done getting variables 10215 1727204054.05411: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204054.05497: variable 'profile' from source: include params 10215 1727204054.05500: variable 'item' from source: include params 10215 1727204054.05549: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.044) 0:00:22.622 ***** 10215 1727204054.05573: entering _queue_task() for managed-node3/assert 10215 1727204054.05802: worker is 1 (out of 1 available) 10215 1727204054.05819: exiting _queue_task() for managed-node3/assert 10215 1727204054.05831: done queuing things up, now waiting for results queue to drain 10215 1727204054.05833: waiting for pending results... 
10215 1727204054.06019: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0' 10215 1727204054.06103: in run() - task 12b410aa-8751-3c74-8f8e-000000000260 10215 1727204054.06118: variable 'ansible_search_path' from source: unknown 10215 1727204054.06122: variable 'ansible_search_path' from source: unknown 10215 1727204054.06153: calling self._execute() 10215 1727204054.06233: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.06240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.06249: variable 'omit' from source: magic vars 10215 1727204054.06550: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.06561: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.06568: variable 'omit' from source: magic vars 10215 1727204054.06602: variable 'omit' from source: magic vars 10215 1727204054.06685: variable 'profile' from source: include params 10215 1727204054.06690: variable 'item' from source: include params 10215 1727204054.06750: variable 'item' from source: include params 10215 1727204054.06766: variable 'omit' from source: magic vars 10215 1727204054.06803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204054.06839: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204054.06857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204054.06873: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.06884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.06916: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204054.06920: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.06925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.07008: Set connection var ansible_connection to ssh 10215 1727204054.07017: Set connection var ansible_pipelining to False 10215 1727204054.07024: Set connection var ansible_shell_type to sh 10215 1727204054.07031: Set connection var ansible_timeout to 10 10215 1727204054.07037: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204054.07048: Set connection var ansible_shell_executable to /bin/sh 10215 1727204054.07069: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.07072: variable 'ansible_connection' from source: unknown 10215 1727204054.07074: variable 'ansible_module_compression' from source: unknown 10215 1727204054.07079: variable 'ansible_shell_type' from source: unknown 10215 1727204054.07081: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.07086: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.07092: variable 'ansible_pipelining' from source: unknown 10215 1727204054.07095: variable 'ansible_timeout' from source: unknown 10215 1727204054.07101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.07223: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204054.07234: variable 'omit' from source: magic vars 10215 1727204054.07240: starting attempt loop 10215 1727204054.07243: running the handler 10215 1727204054.07338: variable 'lsr_net_profile_exists' from source: set_fact 10215 1727204054.07342: Evaluated conditional (lsr_net_profile_exists): True 10215 1727204054.07349: handler run complete 10215 1727204054.07363: attempt loop complete, returning result 10215 1727204054.07368: _execute() done 10215 1727204054.07371: dumping result to json 10215 1727204054.07374: done dumping result, returning 10215 1727204054.07386: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0' [12b410aa-8751-3c74-8f8e-000000000260] 10215 1727204054.07390: sending task result for task 12b410aa-8751-3c74-8f8e-000000000260 10215 1727204054.07477: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000260 10215 1727204054.07480: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204054.07543: no more pending results, returning what we have 10215 1727204054.07547: results queue empty 10215 1727204054.07549: checking for any_errors_fatal 10215 1727204054.07555: done checking for any_errors_fatal 10215 1727204054.07555: checking for max_fail_percentage 10215 1727204054.07557: done checking for max_fail_percentage 10215 1727204054.07559: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.07560: done checking to see if all hosts have failed 10215 1727204054.07560: getting the remaining hosts for this loop 10215 1727204054.07562: done getting the remaining hosts for this loop 10215 1727204054.07566: getting the next task for host managed-node3 10215 1727204054.07573: done getting next task for host managed-node3 10215 1727204054.07576: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10215 1727204054.07579: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.07583: getting variables 10215 1727204054.07585: in VariableManager get_vars() 10215 1727204054.07624: Calling all_inventory to load vars for managed-node3 10215 1727204054.07628: Calling groups_inventory to load vars for managed-node3 10215 1727204054.07630: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.07641: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.07644: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.07647: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.08949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.14034: done with get_vars() 10215 1727204054.14058: done getting variables 10215 1727204054.14103: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204054.14184: variable 'profile' from source: include params 10215 1727204054.14187: variable 'item' from source: include params 10215 1727204054.14239: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.086) 0:00:22.709 ***** 10215 1727204054.14265: entering _queue_task() for managed-node3/assert 10215 1727204054.14536: worker is 1 (out of 1 available) 10215 1727204054.14551: exiting _queue_task() for managed-node3/assert 10215 1727204054.14564: done queuing things up, now waiting for results queue to drain 10215 1727204054.14566: waiting for pending results... 
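
The assert that just reported "All assertions passed" checked the lsr_net_profile_exists fact, and the two asserts that follow (assert_profile_present.yml:10 and :15) do the same for lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint. The task file itself is not included in this log; a sketch consistent with the task names, paths and conditions recorded here would be:

    # Sketch of assert_profile_present.yml as implied by this log; the exact file
    # contents (fail messages, formatting) are not shown and are assumed.
    - name: Assert that the profile is present - '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_exists

    - name: Assert that the ansible managed comment is present in '{{ profile }}'
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: Assert that the fingerprint comment is present in {{ profile }}
      assert:
        that:
          - lsr_net_profile_fingerprint
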
10215 1727204054.14765: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0' 10215 1727204054.14848: in run() - task 12b410aa-8751-3c74-8f8e-000000000261 10215 1727204054.14861: variable 'ansible_search_path' from source: unknown 10215 1727204054.14865: variable 'ansible_search_path' from source: unknown 10215 1727204054.14902: calling self._execute() 10215 1727204054.14983: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.14987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.15001: variable 'omit' from source: magic vars 10215 1727204054.15332: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.15344: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.15352: variable 'omit' from source: magic vars 10215 1727204054.15386: variable 'omit' from source: magic vars 10215 1727204054.15475: variable 'profile' from source: include params 10215 1727204054.15481: variable 'item' from source: include params 10215 1727204054.15536: variable 'item' from source: include params 10215 1727204054.15554: variable 'omit' from source: magic vars 10215 1727204054.15594: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204054.15629: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204054.15646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204054.15665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.15676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.15709: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204054.15716: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.15720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.15806: Set connection var ansible_connection to ssh 10215 1727204054.15815: Set connection var ansible_pipelining to False 10215 1727204054.15822: Set connection var ansible_shell_type to sh 10215 1727204054.15829: Set connection var ansible_timeout to 10 10215 1727204054.15836: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204054.15844: Set connection var ansible_shell_executable to /bin/sh 10215 1727204054.15863: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.15866: variable 'ansible_connection' from source: unknown 10215 1727204054.15869: variable 'ansible_module_compression' from source: unknown 10215 1727204054.15874: variable 'ansible_shell_type' from source: unknown 10215 1727204054.15877: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.15880: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.15888: variable 'ansible_pipelining' from source: unknown 10215 1727204054.15892: variable 'ansible_timeout' from source: unknown 10215 1727204054.15900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.16018: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204054.16029: variable 'omit' from source: magic vars 10215 1727204054.16034: starting attempt loop 10215 1727204054.16038: running the handler 10215 1727204054.16130: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10215 1727204054.16134: Evaluated conditional (lsr_net_profile_ansible_managed): True 10215 1727204054.16142: handler run complete 10215 1727204054.16156: attempt loop complete, returning result 10215 1727204054.16159: _execute() done 10215 1727204054.16162: dumping result to json 10215 1727204054.16167: done dumping result, returning 10215 1727204054.16175: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0' [12b410aa-8751-3c74-8f8e-000000000261] 10215 1727204054.16182: sending task result for task 12b410aa-8751-3c74-8f8e-000000000261 10215 1727204054.16271: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000261 10215 1727204054.16274: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204054.16327: no more pending results, returning what we have 10215 1727204054.16330: results queue empty 10215 1727204054.16331: checking for any_errors_fatal 10215 1727204054.16337: done checking for any_errors_fatal 10215 1727204054.16339: checking for max_fail_percentage 10215 1727204054.16341: done checking for max_fail_percentage 10215 1727204054.16342: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.16344: done checking to see if all hosts have failed 10215 1727204054.16344: getting the remaining hosts for this loop 10215 1727204054.16346: done getting the remaining hosts for this loop 10215 1727204054.16351: getting the next task for host managed-node3 10215 1727204054.16357: done getting next task for host managed-node3 10215 1727204054.16360: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10215 1727204054.16363: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.16367: getting variables 10215 1727204054.16369: in VariableManager get_vars() 10215 1727204054.16415: Calling all_inventory to load vars for managed-node3 10215 1727204054.16418: Calling groups_inventory to load vars for managed-node3 10215 1727204054.16421: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.16433: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.16436: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.16439: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.17628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.19176: done with get_vars() 10215 1727204054.19199: done getting variables 10215 1727204054.19248: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204054.19335: variable 'profile' from source: include params 10215 1727204054.19338: variable 'item' from source: include params 10215 1727204054.19382: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.051) 0:00:22.761 ***** 10215 1727204054.19413: entering _queue_task() for managed-node3/assert 10215 1727204054.19640: worker is 1 (out of 1 available) 10215 1727204054.19654: exiting _queue_task() for managed-node3/assert 10215 1727204054.19667: done queuing things up, now waiting for results queue to drain 10215 1727204054.19669: waiting for pending results... 
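
Each task in this section repeats the same "Set connection var" records: ansible_connection ssh, ansible_shell_type sh, ansible_shell_executable /bin/sh, ansible_timeout 10, ansible_pipelining False. Expressed as inventory host variables this would correspond roughly to the snippet below; this is illustrative only, since the run takes these values from its own inventory and defaults rather than from a file like this.

    # Illustrative mapping of the "Set connection var" records to inventory vars;
    # values copied from the log, the file layout itself is an assumption.
    managed-node3:
      ansible_connection: ssh
      ansible_shell_type: sh
      ansible_shell_executable: /bin/sh
      ansible_timeout: 10
      ansible_pipelining: false
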
10215 1727204054.19855: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0 10215 1727204054.19940: in run() - task 12b410aa-8751-3c74-8f8e-000000000262 10215 1727204054.19953: variable 'ansible_search_path' from source: unknown 10215 1727204054.19957: variable 'ansible_search_path' from source: unknown 10215 1727204054.19988: calling self._execute() 10215 1727204054.20069: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.20074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.20085: variable 'omit' from source: magic vars 10215 1727204054.20394: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.20404: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.20413: variable 'omit' from source: magic vars 10215 1727204054.20451: variable 'omit' from source: magic vars 10215 1727204054.20536: variable 'profile' from source: include params 10215 1727204054.20541: variable 'item' from source: include params 10215 1727204054.20598: variable 'item' from source: include params 10215 1727204054.20616: variable 'omit' from source: magic vars 10215 1727204054.20650: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204054.20684: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204054.20702: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204054.20723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.20734: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.20764: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204054.20768: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.20770: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.20857: Set connection var ansible_connection to ssh 10215 1727204054.20864: Set connection var ansible_pipelining to False 10215 1727204054.20872: Set connection var ansible_shell_type to sh 10215 1727204054.20880: Set connection var ansible_timeout to 10 10215 1727204054.20888: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204054.20899: Set connection var ansible_shell_executable to /bin/sh 10215 1727204054.20919: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.20923: variable 'ansible_connection' from source: unknown 10215 1727204054.20926: variable 'ansible_module_compression' from source: unknown 10215 1727204054.20931: variable 'ansible_shell_type' from source: unknown 10215 1727204054.20933: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.20936: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.20942: variable 'ansible_pipelining' from source: unknown 10215 1727204054.20946: variable 'ansible_timeout' from source: unknown 10215 1727204054.20951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.21067: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204054.21076: variable 'omit' from source: magic vars 10215 1727204054.21082: starting attempt loop 10215 1727204054.21085: running the handler 10215 1727204054.21177: variable 'lsr_net_profile_fingerprint' from source: set_fact 10215 1727204054.21181: Evaluated conditional (lsr_net_profile_fingerprint): True 10215 1727204054.21191: handler run complete 10215 1727204054.21205: attempt loop complete, returning result 10215 1727204054.21210: _execute() done 10215 1727204054.21218: dumping result to json 10215 1727204054.21223: done dumping result, returning 10215 1727204054.21226: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0 [12b410aa-8751-3c74-8f8e-000000000262] 10215 1727204054.21234: sending task result for task 12b410aa-8751-3c74-8f8e-000000000262 10215 1727204054.21321: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000262 10215 1727204054.21324: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204054.21378: no more pending results, returning what we have 10215 1727204054.21381: results queue empty 10215 1727204054.21382: checking for any_errors_fatal 10215 1727204054.21387: done checking for any_errors_fatal 10215 1727204054.21388: checking for max_fail_percentage 10215 1727204054.21391: done checking for max_fail_percentage 10215 1727204054.21393: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.21394: done checking to see if all hosts have failed 10215 1727204054.21395: getting the remaining hosts for this loop 10215 1727204054.21397: done getting the remaining hosts for this loop 10215 1727204054.21401: getting the next task for host managed-node3 10215 1727204054.21411: done getting next task for host managed-node3 10215 1727204054.21414: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10215 1727204054.21417: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.21420: getting variables 10215 1727204054.21422: in VariableManager get_vars() 10215 1727204054.21458: Calling all_inventory to load vars for managed-node3 10215 1727204054.21461: Calling groups_inventory to load vars for managed-node3 10215 1727204054.21464: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.21474: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.21477: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.21480: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.22786: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.24335: done with get_vars() 10215 1727204054.24361: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.050) 0:00:22.811 ***** 10215 1727204054.24447: entering _queue_task() for managed-node3/include_tasks 10215 1727204054.24712: worker is 1 (out of 1 available) 10215 1727204054.24728: exiting _queue_task() for managed-node3/include_tasks 10215 1727204054.24742: done queuing things up, now waiting for results queue to drain 10215 1727204054.24744: waiting for pending results... 10215 1727204054.24941: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 10215 1727204054.25033: in run() - task 12b410aa-8751-3c74-8f8e-000000000266 10215 1727204054.25046: variable 'ansible_search_path' from source: unknown 10215 1727204054.25049: variable 'ansible_search_path' from source: unknown 10215 1727204054.25081: calling self._execute() 10215 1727204054.25170: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.25177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.25189: variable 'omit' from source: magic vars 10215 1727204054.25522: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.25534: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.25542: _execute() done 10215 1727204054.25546: dumping result to json 10215 1727204054.25551: done dumping result, returning 10215 1727204054.25558: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-3c74-8f8e-000000000266] 10215 1727204054.25565: sending task result for task 12b410aa-8751-3c74-8f8e-000000000266 10215 1727204054.25662: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000266 10215 1727204054.25665: WORKER PROCESS EXITING 10215 1727204054.25697: no more pending results, returning what we have 10215 1727204054.25703: in VariableManager get_vars() 10215 1727204054.25751: Calling all_inventory to load vars for managed-node3 10215 1727204054.25754: Calling groups_inventory to load vars for managed-node3 10215 1727204054.25757: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.25771: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.25775: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.25779: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.26987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10215 1727204054.28768: done with get_vars() 10215 1727204054.28787: variable 'ansible_search_path' from source: unknown 10215 1727204054.28788: variable 'ansible_search_path' from source: unknown 10215 1727204054.28822: we have included files to process 10215 1727204054.28823: generating all_blocks data 10215 1727204054.28825: done generating all_blocks data 10215 1727204054.28829: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204054.28830: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204054.28832: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204054.29596: done processing included file 10215 1727204054.29598: iterating over new_blocks loaded from include file 10215 1727204054.29599: in VariableManager get_vars() 10215 1727204054.29619: done with get_vars() 10215 1727204054.29621: filtering new block on tags 10215 1727204054.29641: done filtering new block on tags 10215 1727204054.29644: in VariableManager get_vars() 10215 1727204054.29658: done with get_vars() 10215 1727204054.29659: filtering new block on tags 10215 1727204054.29676: done filtering new block on tags 10215 1727204054.29678: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 10215 1727204054.29682: extending task lists for all hosts with included blocks 10215 1727204054.29864: done extending task lists 10215 1727204054.29865: done processing included files 10215 1727204054.29866: results queue empty 10215 1727204054.29867: checking for any_errors_fatal 10215 1727204054.29872: done checking for any_errors_fatal 10215 1727204054.29873: checking for max_fail_percentage 10215 1727204054.29875: done checking for max_fail_percentage 10215 1727204054.29876: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.29876: done checking to see if all hosts have failed 10215 1727204054.29877: getting the remaining hosts for this loop 10215 1727204054.29879: done getting the remaining hosts for this loop 10215 1727204054.29881: getting the next task for host managed-node3 10215 1727204054.29886: done getting next task for host managed-node3 10215 1727204054.29888: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204054.29893: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.29896: getting variables 10215 1727204054.29897: in VariableManager get_vars() 10215 1727204054.29912: Calling all_inventory to load vars for managed-node3 10215 1727204054.29915: Calling groups_inventory to load vars for managed-node3 10215 1727204054.29917: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.29923: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.29926: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.29929: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.31909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.33577: done with get_vars() 10215 1727204054.33602: done getting variables 10215 1727204054.33643: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.092) 0:00:22.903 ***** 10215 1727204054.33667: entering _queue_task() for managed-node3/set_fact 10215 1727204054.33939: worker is 1 (out of 1 available) 10215 1727204054.33956: exiting _queue_task() for managed-node3/set_fact 10215 1727204054.33968: done queuing things up, now waiting for results queue to drain 10215 1727204054.33970: waiting for pending results... 
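
The "Initialize NM profile exist and ansible_managed comment flag" task queued above (get_profile_stat.yml:3) sets the three flags that the asserts consume; the ansible_facts printed in its result a few records below show exactly which facts it sets and to what values. A reconstruction consistent with that output:

    # Reconstructed from the ansible_facts shown in the task result below; the
    # fact names and values come from the log, the formatting is assumed.
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false
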
10215 1727204054.34273: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204054.34496: in run() - task 12b410aa-8751-3c74-8f8e-0000000003f8 10215 1727204054.34500: variable 'ansible_search_path' from source: unknown 10215 1727204054.34504: variable 'ansible_search_path' from source: unknown 10215 1727204054.34506: calling self._execute() 10215 1727204054.34531: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.34544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.34560: variable 'omit' from source: magic vars 10215 1727204054.35018: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.35038: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.35050: variable 'omit' from source: magic vars 10215 1727204054.35114: variable 'omit' from source: magic vars 10215 1727204054.35165: variable 'omit' from source: magic vars 10215 1727204054.35215: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204054.35262: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204054.35288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204054.35319: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.35336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.35376: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204054.35385: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.35396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.35521: Set connection var ansible_connection to ssh 10215 1727204054.35534: Set connection var ansible_pipelining to False 10215 1727204054.35547: Set connection var ansible_shell_type to sh 10215 1727204054.35559: Set connection var ansible_timeout to 10 10215 1727204054.35571: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204054.35587: Set connection var ansible_shell_executable to /bin/sh 10215 1727204054.35617: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.35626: variable 'ansible_connection' from source: unknown 10215 1727204054.35634: variable 'ansible_module_compression' from source: unknown 10215 1727204054.35640: variable 'ansible_shell_type' from source: unknown 10215 1727204054.35648: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.35694: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.35697: variable 'ansible_pipelining' from source: unknown 10215 1727204054.35699: variable 'ansible_timeout' from source: unknown 10215 1727204054.35702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.35848: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204054.35867: variable 
'omit' from source: magic vars 10215 1727204054.35881: starting attempt loop 10215 1727204054.35888: running the handler 10215 1727204054.35912: handler run complete 10215 1727204054.35967: attempt loop complete, returning result 10215 1727204054.35970: _execute() done 10215 1727204054.35972: dumping result to json 10215 1727204054.35975: done dumping result, returning 10215 1727204054.35977: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-3c74-8f8e-0000000003f8] 10215 1727204054.35980: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003f8 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10215 1727204054.36147: no more pending results, returning what we have 10215 1727204054.36151: results queue empty 10215 1727204054.36152: checking for any_errors_fatal 10215 1727204054.36155: done checking for any_errors_fatal 10215 1727204054.36156: checking for max_fail_percentage 10215 1727204054.36158: done checking for max_fail_percentage 10215 1727204054.36159: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.36160: done checking to see if all hosts have failed 10215 1727204054.36161: getting the remaining hosts for this loop 10215 1727204054.36162: done getting the remaining hosts for this loop 10215 1727204054.36167: getting the next task for host managed-node3 10215 1727204054.36175: done getting next task for host managed-node3 10215 1727204054.36178: ^ task is: TASK: Stat profile file 10215 1727204054.36182: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.36186: getting variables 10215 1727204054.36188: in VariableManager get_vars() 10215 1727204054.36348: Calling all_inventory to load vars for managed-node3 10215 1727204054.36352: Calling groups_inventory to load vars for managed-node3 10215 1727204054.36354: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.36366: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.36369: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.36373: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.36896: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003f8 10215 1727204054.36899: WORKER PROCESS EXITING 10215 1727204054.38785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.41948: done with get_vars() 10215 1727204054.42000: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.084) 0:00:22.988 ***** 10215 1727204054.42129: entering _queue_task() for managed-node3/stat 10215 1727204054.42523: worker is 1 (out of 1 available) 10215 1727204054.42544: exiting _queue_task() for managed-node3/stat 10215 1727204054.42556: done queuing things up, now waiting for results queue to drain 10215 1727204054.42559: waiting for pending results... 10215 1727204054.42876: running TaskExecutor() for managed-node3/TASK: Stat profile file 10215 1727204054.43096: in run() - task 12b410aa-8751-3c74-8f8e-0000000003f9 10215 1727204054.43101: variable 'ansible_search_path' from source: unknown 10215 1727204054.43104: variable 'ansible_search_path' from source: unknown 10215 1727204054.43110: calling self._execute() 10215 1727204054.43183: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.43192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.43205: variable 'omit' from source: magic vars 10215 1727204054.43710: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.43714: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.43720: variable 'omit' from source: magic vars 10215 1727204054.43786: variable 'omit' from source: magic vars 10215 1727204054.43913: variable 'profile' from source: include params 10215 1727204054.43924: variable 'item' from source: include params 10215 1727204054.44004: variable 'item' from source: include params 10215 1727204054.44022: variable 'omit' from source: magic vars 10215 1727204054.44068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204054.44113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204054.44201: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204054.44205: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.44211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204054.44214: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 10215 1727204054.44217: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.44219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.44455: Set connection var ansible_connection to ssh 10215 1727204054.44459: Set connection var ansible_pipelining to False 10215 1727204054.44464: Set connection var ansible_shell_type to sh 10215 1727204054.44466: Set connection var ansible_timeout to 10 10215 1727204054.44469: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204054.44472: Set connection var ansible_shell_executable to /bin/sh 10215 1727204054.44474: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.44476: variable 'ansible_connection' from source: unknown 10215 1727204054.44479: variable 'ansible_module_compression' from source: unknown 10215 1727204054.44481: variable 'ansible_shell_type' from source: unknown 10215 1727204054.44483: variable 'ansible_shell_executable' from source: unknown 10215 1727204054.44485: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.44487: variable 'ansible_pipelining' from source: unknown 10215 1727204054.44491: variable 'ansible_timeout' from source: unknown 10215 1727204054.44493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.44734: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204054.44744: variable 'omit' from source: magic vars 10215 1727204054.44750: starting attempt loop 10215 1727204054.44753: running the handler 10215 1727204054.44774: _low_level_execute_command(): starting 10215 1727204054.44778: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204054.45283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204054.45323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.45328: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204054.45331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.45381: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204054.45386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.45431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.47198: stdout chunk (state=3): >>>/root <<< 10215 1727204054.47305: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204054.47350: stderr chunk (state=3): >>><<< 10215 1727204054.47354: stdout chunk (state=3): >>><<< 10215 1727204054.47381: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204054.47395: _low_level_execute_command(): starting 10215 1727204054.47402: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959 `" && echo ansible-tmp-1727204054.4738133-11569-52367431006959="` echo /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959 `" ) && sleep 0' 10215 1727204054.47853: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204054.47856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.47866: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204054.47870: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204054.47872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.47922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204054.47927: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.47967: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.49935: stdout chunk (state=3): >>>ansible-tmp-1727204054.4738133-11569-52367431006959=/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959 <<< 10215 
1727204054.50114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204054.50141: stderr chunk (state=3): >>><<< 10215 1727204054.50156: stdout chunk (state=3): >>><<< 10215 1727204054.50179: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204054.4738133-11569-52367431006959=/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204054.50252: variable 'ansible_module_compression' from source: unknown 10215 1727204054.50332: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204054.50496: variable 'ansible_facts' from source: unknown 10215 1727204054.50499: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py 10215 1727204054.50749: Sending initial data 10215 1727204054.50764: Sent initial data (152 bytes) 10215 1727204054.51302: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204054.51314: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204054.51327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204054.51410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.51444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204054.51457: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204054.51467: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.51541: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.53177: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10215 1727204054.53215: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204054.53253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204054.53305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpj0rqnei8 /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py <<< 10215 1727204054.53310: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py" <<< 10215 1727204054.53342: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpj0rqnei8" to remote "/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py" <<< 10215 1727204054.66444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204054.66484: stderr chunk (state=3): >>><<< 10215 1727204054.66497: stdout chunk (state=3): >>><<< 10215 1727204054.66533: done transferring module to remote 10215 1727204054.66562: _low_level_execute_command(): starting 10215 1727204054.66572: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/ /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py && sleep 0' 10215 1727204054.67288: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204054.67319: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204054.67339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204054.67406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
10215 1727204054.67494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204054.67518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204054.67564: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.67609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.69701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204054.69705: stdout chunk (state=3): >>><<< 10215 1727204054.69710: stderr chunk (state=3): >>><<< 10215 1727204054.69712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204054.69715: _low_level_execute_command(): starting 10215 1727204054.69717: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/AnsiballZ_stat.py && sleep 0' 10215 1727204054.70382: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204054.70515: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204054.70538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204054.70556: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.70650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.87971: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, 
"invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 1727204054.89444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204054.89506: stderr chunk (state=3): >>><<< 10215 1727204054.89510: stdout chunk (state=3): >>><<< 10215 1727204054.89529: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204054.89556: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204054.89566: _low_level_execute_command(): starting 10215 1727204054.89572: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204054.4738133-11569-52367431006959/ > /dev/null 2>&1 && sleep 0' 10215 1727204054.90046: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204054.90050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.90053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204054.90055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204054.90109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204054.90114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204054.90151: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204054.92090: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204054.92140: stderr chunk (state=3): >>><<< 10215 1727204054.92143: stdout chunk (state=3): >>><<< 10215 1727204054.92159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204054.92166: handler run complete 10215 1727204054.92185: attempt loop complete, returning result 10215 1727204054.92188: _execute() done 10215 1727204054.92195: dumping result to json 10215 1727204054.92201: done dumping result, returning 10215 1727204054.92214: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-3c74-8f8e-0000000003f9] 10215 1727204054.92217: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003f9 10215 1727204054.92324: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003f9 10215 1727204054.92328: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10215 1727204054.92402: no more pending results, returning what we have 10215 1727204054.92406: results queue empty 10215 1727204054.92409: checking for any_errors_fatal 10215 1727204054.92416: done checking for any_errors_fatal 10215 1727204054.92416: checking for max_fail_percentage 10215 1727204054.92418: done checking for max_fail_percentage 10215 1727204054.92419: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.92421: done checking to see if all hosts have failed 10215 1727204054.92421: getting the remaining hosts for this loop 10215 1727204054.92423: done getting the remaining hosts for this loop 10215 1727204054.92427: getting the next task for host managed-node3 10215 1727204054.92435: done getting next task for host managed-node3 10215 1727204054.92439: ^ task is: TASK: Set NM profile exist flag based on the profile files 10215 1727204054.92444: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.92448: getting variables 10215 1727204054.92450: in VariableManager get_vars() 10215 1727204054.92500: Calling all_inventory to load vars for managed-node3 10215 1727204054.92503: Calling groups_inventory to load vars for managed-node3 10215 1727204054.92508: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.92521: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.92523: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.92527: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.93867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.95419: done with get_vars() 10215 1727204054.95447: done getting variables 10215 1727204054.95501: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.533) 0:00:23.522 ***** 10215 1727204054.95530: entering _queue_task() for managed-node3/set_fact 10215 1727204054.95803: worker is 1 (out of 1 available) 10215 1727204054.95822: exiting _queue_task() for managed-node3/set_fact 10215 1727204054.95836: done queuing things up, now waiting for results queue to drain 10215 1727204054.95838: waiting for pending results... 
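
The records above for AnsiballZ_stat.py show the remote-execution lifecycle that repeats for every module in this log: create a uniquely named temp directory under /root/.ansible/tmp, upload the packaged module over the multiplexed SSH connection with sftp, chmod u+x, run it with /usr/bin/python3.12, then rm -f -r the directory. A rough sketch of that flow using plain ssh/sftp subprocesses; this is an assumption-laden illustration (ControlMaster reuse, AnsiballZ packaging and pipelining decisions are all omitted), not Ansible's connection plugin:

import os
import random
import subprocess
import time

HOST = "root@10.31.10.90"  # address taken from the log above

def run_module(local_module, remote_python="/usr/bin/python3.12"):
    # 1. Remote temp dir, named like ansible-tmp-<epoch>-<pid>-<random> in the log.
    tmp = "/root/.ansible/tmp/ansible-tmp-%s-%d-%d" % (time.time(), os.getpid(),
                                                       random.randrange(10**14))
    subprocess.run(["ssh", HOST, 'umask 77 && mkdir -p "%s"' % tmp], check=True)
    # 2. Upload the module ("sftp> put ... AnsiballZ_stat.py" in the log).
    remote_module = "%s/%s" % (tmp, os.path.basename(local_module))
    subprocess.run(["sftp", "-b", "-", HOST], text=True, check=True,
                   input="put %s %s\n" % (local_module, remote_module))
    # 3. chmod u+x on the directory and the file, then execute with the remote Python.
    subprocess.run(["ssh", HOST, "chmod u+x %s %s" % (tmp, remote_module)], check=True)
    result = subprocess.run(["ssh", HOST, "%s %s" % (remote_python, remote_module)],
                            capture_output=True, text=True)
    # 4. Clean up, mirroring the 'rm -f -r ... > /dev/null 2>&1 && sleep 0' step.
    subprocess.run(["ssh", HOST, "rm -f -r %s > /dev/null 2>&1" % tmp], check=True)
    return result.stdout  # the module's JSON result, as in the rc=0 stdout above
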
10215 1727204054.96029: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10215 1727204054.96124: in run() - task 12b410aa-8751-3c74-8f8e-0000000003fa 10215 1727204054.96136: variable 'ansible_search_path' from source: unknown 10215 1727204054.96140: variable 'ansible_search_path' from source: unknown 10215 1727204054.96180: calling self._execute() 10215 1727204054.96257: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204054.96263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204054.96273: variable 'omit' from source: magic vars 10215 1727204054.96593: variable 'ansible_distribution_major_version' from source: facts 10215 1727204054.96604: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204054.96712: variable 'profile_stat' from source: set_fact 10215 1727204054.96727: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204054.96732: when evaluation is False, skipping this task 10215 1727204054.96736: _execute() done 10215 1727204054.96739: dumping result to json 10215 1727204054.96742: done dumping result, returning 10215 1727204054.96746: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-3c74-8f8e-0000000003fa] 10215 1727204054.96754: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fa 10215 1727204054.96845: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fa 10215 1727204054.96848: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204054.96909: no more pending results, returning what we have 10215 1727204054.96914: results queue empty 10215 1727204054.96915: checking for any_errors_fatal 10215 1727204054.96925: done checking for any_errors_fatal 10215 1727204054.96926: checking for max_fail_percentage 10215 1727204054.96928: done checking for max_fail_percentage 10215 1727204054.96929: checking to see if all hosts have failed and the running result is not ok 10215 1727204054.96930: done checking to see if all hosts have failed 10215 1727204054.96931: getting the remaining hosts for this loop 10215 1727204054.96932: done getting the remaining hosts for this loop 10215 1727204054.96936: getting the next task for host managed-node3 10215 1727204054.96943: done getting next task for host managed-node3 10215 1727204054.96946: ^ task is: TASK: Get NM profile info 10215 1727204054.96950: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204054.96954: getting variables 10215 1727204054.96956: in VariableManager get_vars() 10215 1727204054.97003: Calling all_inventory to load vars for managed-node3 10215 1727204054.97006: Calling groups_inventory to load vars for managed-node3 10215 1727204054.97011: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204054.97023: Calling all_plugins_play to load vars for managed-node3 10215 1727204054.97026: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204054.97029: Calling groups_plugins_play to load vars for managed-node3 10215 1727204054.98218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204054.99790: done with get_vars() 10215 1727204054.99814: done getting variables 10215 1727204054.99861: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:14 -0400 (0:00:00.043) 0:00:23.566 ***** 10215 1727204054.99887: entering _queue_task() for managed-node3/shell 10215 1727204055.00129: worker is 1 (out of 1 available) 10215 1727204055.00145: exiting _queue_task() for managed-node3/shell 10215 1727204055.00159: done queuing things up, now waiting for results queue to drain 10215 1727204055.00160: waiting for pending results... 10215 1727204055.00341: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10215 1727204055.00425: in run() - task 12b410aa-8751-3c74-8f8e-0000000003fb 10215 1727204055.00439: variable 'ansible_search_path' from source: unknown 10215 1727204055.00442: variable 'ansible_search_path' from source: unknown 10215 1727204055.00473: calling self._execute() 10215 1727204055.00558: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.00562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.00574: variable 'omit' from source: magic vars 10215 1727204055.00894: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.00906: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.00912: variable 'omit' from source: magic vars 10215 1727204055.00959: variable 'omit' from source: magic vars 10215 1727204055.01049: variable 'profile' from source: include params 10215 1727204055.01053: variable 'item' from source: include params 10215 1727204055.01111: variable 'item' from source: include params 10215 1727204055.01126: variable 'omit' from source: magic vars 10215 1727204055.01169: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204055.01203: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204055.01223: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204055.01239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.01250: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.01281: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204055.01285: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.01291: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.01373: Set connection var ansible_connection to ssh 10215 1727204055.01382: Set connection var ansible_pipelining to False 10215 1727204055.01389: Set connection var ansible_shell_type to sh 10215 1727204055.01397: Set connection var ansible_timeout to 10 10215 1727204055.01404: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204055.01414: Set connection var ansible_shell_executable to /bin/sh 10215 1727204055.01434: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.01437: variable 'ansible_connection' from source: unknown 10215 1727204055.01440: variable 'ansible_module_compression' from source: unknown 10215 1727204055.01445: variable 'ansible_shell_type' from source: unknown 10215 1727204055.01448: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.01452: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.01457: variable 'ansible_pipelining' from source: unknown 10215 1727204055.01460: variable 'ansible_timeout' from source: unknown 10215 1727204055.01466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.01591: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.01599: variable 'omit' from source: magic vars 10215 1727204055.01611: starting attempt loop 10215 1727204055.01614: running the handler 10215 1727204055.01621: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.01641: _low_level_execute_command(): starting 10215 1727204055.01648: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204055.02195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.02206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.02222: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.02284: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204055.02293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204055.02295: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.02332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.04083: stdout chunk (state=3): >>>/root <<< 10215 1727204055.04196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204055.04252: stderr chunk (state=3): >>><<< 10215 1727204055.04255: stdout chunk (state=3): >>><<< 10215 1727204055.04278: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204055.04291: _low_level_execute_command(): starting 10215 1727204055.04299: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771 `" && echo ansible-tmp-1727204055.0427792-11591-61089128710771="` echo /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771 `" ) && sleep 0' 10215 1727204055.04798: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.04809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204055.04811: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10215 1727204055.04814: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204055.04817: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.04872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204055.04875: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.04911: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.06907: stdout chunk (state=3): >>>ansible-tmp-1727204055.0427792-11591-61089128710771=/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771 <<< 10215 1727204055.07031: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204055.07077: stderr chunk (state=3): >>><<< 10215 1727204055.07081: stdout chunk (state=3): >>><<< 10215 1727204055.07101: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204055.0427792-11591-61089128710771=/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204055.07135: variable 'ansible_module_compression' from source: unknown 10215 1727204055.07177: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204055.07216: variable 'ansible_facts' from source: unknown 10215 1727204055.07282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py 10215 1727204055.07402: Sending initial data 10215 1727204055.07406: Sent initial data (155 bytes) 10215 1727204055.07847: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.07897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204055.07900: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.07903: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204055.07905: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204055.07907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.07952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204055.07956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.07997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.09631: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204055.09663: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204055.09723: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp9wvyp399 /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py <<< 10215 1727204055.09730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py" <<< 10215 1727204055.09755: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp9wvyp399" to remote "/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py" <<< 10215 1727204055.10539: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204055.10602: stderr chunk (state=3): >>><<< 10215 1727204055.10606: stdout chunk (state=3): >>><<< 10215 1727204055.10628: done transferring module to remote 10215 1727204055.10638: _low_level_execute_command(): starting 10215 1727204055.10643: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/ /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py && sleep 0' 10215 1727204055.11075: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204055.11113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204055.11117: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204055.11119: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.11125: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.11127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.11180: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204055.11183: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.11225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.13106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204055.13156: stderr chunk (state=3): >>><<< 10215 1727204055.13159: stdout chunk (state=3): >>><<< 10215 1727204055.13175: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204055.13178: _low_level_execute_command(): starting 10215 1727204055.13187: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/AnsiballZ_command.py && sleep 0' 10215 1727204055.13665: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.13668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.13671: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 10215 1727204055.13673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.13731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204055.13738: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204055.13740: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.13781: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.33345: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:15.309466", "end": "2024-09-24 14:54:15.332438", "delta": "0:00:00.022972", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204055.35062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204055.35128: stderr chunk (state=3): >>><<< 10215 1727204055.35131: stdout chunk (state=3): >>><<< 10215 1727204055.35150: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:15.309466", "end": "2024-09-24 14:54:15.332438", "delta": "0:00:00.022972", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
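
The "Get NM profile info" task just ran nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc and got rc=0 with the line "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection"; the set_fact task that follows keys off that return code. A small sketch of the same check in Python, assuming nmcli is on PATH (illustrative only, not the shell module):

import subprocess

def nm_profile_in_etc(profile="bond0.0"):
    # grep exits non-zero when nothing matches, so rc == 0 means a NetworkManager
    # connection named <profile> is backed by a keyfile under /etc.
    cmd = "nmcli -f NAME,FILENAME connection show | grep %s | grep /etc" % profile
    proc = subprocess.run(cmd, shell=True, capture_output=True, text=True)
    return proc.returncode == 0, proc.stdout

exists, lines = nm_profile_in_etc()
# On the managed node above: exists is True and lines contains
# "bond0.0  /etc/NetworkManager/system-connections/bond0.0.nmconnection"
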
10215 1727204055.35191: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204055.35200: _low_level_execute_command(): starting 10215 1727204055.35209: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204055.0427792-11591-61089128710771/ > /dev/null 2>&1 && sleep 0' 10215 1727204055.35664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204055.35674: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204055.35704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.35710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204055.35712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204055.35767: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204055.35771: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204055.35813: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204055.37711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204055.37763: stderr chunk (state=3): >>><<< 10215 1727204055.37767: stdout chunk (state=3): >>><<< 10215 1727204055.37787: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204055.37797: handler run complete 10215 1727204055.37822: Evaluated conditional (False): False 10215 1727204055.37832: attempt loop complete, returning result 10215 1727204055.37835: _execute() done 10215 1727204055.37840: dumping result to json 10215 1727204055.37846: done dumping result, returning 10215 1727204055.37854: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-3c74-8f8e-0000000003fb] 10215 1727204055.37860: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fb 10215 1727204055.37972: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fb 10215 1727204055.37975: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.022972", "end": "2024-09-24 14:54:15.332438", "rc": 0, "start": "2024-09-24 14:54:15.309466" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 10215 1727204055.38075: no more pending results, returning what we have 10215 1727204055.38079: results queue empty 10215 1727204055.38080: checking for any_errors_fatal 10215 1727204055.38088: done checking for any_errors_fatal 10215 1727204055.38089: checking for max_fail_percentage 10215 1727204055.38091: done checking for max_fail_percentage 10215 1727204055.38092: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.38093: done checking to see if all hosts have failed 10215 1727204055.38094: getting the remaining hosts for this loop 10215 1727204055.38096: done getting the remaining hosts for this loop 10215 1727204055.38102: getting the next task for host managed-node3 10215 1727204055.38111: done getting next task for host managed-node3 10215 1727204055.38114: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204055.38118: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.38123: getting variables 10215 1727204055.38124: in VariableManager get_vars() 10215 1727204055.38164: Calling all_inventory to load vars for managed-node3 10215 1727204055.38167: Calling groups_inventory to load vars for managed-node3 10215 1727204055.38169: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.38181: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.38184: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.38187: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.39549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.41111: done with get_vars() 10215 1727204055.41142: done getting variables 10215 1727204055.41195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.413) 0:00:23.979 ***** 10215 1727204055.41223: entering _queue_task() for managed-node3/set_fact 10215 1727204055.41501: worker is 1 (out of 1 available) 10215 1727204055.41518: exiting _queue_task() for managed-node3/set_fact 10215 1727204055.41532: done queuing things up, now waiting for results queue to drain 10215 1727204055.41534: waiting for pending results... 
10215 1727204055.41736: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204055.41828: in run() - task 12b410aa-8751-3c74-8f8e-0000000003fc 10215 1727204055.41842: variable 'ansible_search_path' from source: unknown 10215 1727204055.41846: variable 'ansible_search_path' from source: unknown 10215 1727204055.41882: calling self._execute() 10215 1727204055.41968: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.41975: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.42094: variable 'omit' from source: magic vars 10215 1727204055.42330: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.42334: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.42439: variable 'nm_profile_exists' from source: set_fact 10215 1727204055.42454: Evaluated conditional (nm_profile_exists.rc == 0): True 10215 1727204055.42461: variable 'omit' from source: magic vars 10215 1727204055.42499: variable 'omit' from source: magic vars 10215 1727204055.42528: variable 'omit' from source: magic vars 10215 1727204055.42567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204055.42602: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204055.42622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204055.42639: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.42651: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.42680: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204055.42684: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.42688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.42775: Set connection var ansible_connection to ssh 10215 1727204055.42782: Set connection var ansible_pipelining to False 10215 1727204055.42788: Set connection var ansible_shell_type to sh 10215 1727204055.42797: Set connection var ansible_timeout to 10 10215 1727204055.42803: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204055.42812: Set connection var ansible_shell_executable to /bin/sh 10215 1727204055.42830: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.42833: variable 'ansible_connection' from source: unknown 10215 1727204055.42837: variable 'ansible_module_compression' from source: unknown 10215 1727204055.42841: variable 'ansible_shell_type' from source: unknown 10215 1727204055.42845: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.42849: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.42854: variable 'ansible_pipelining' from source: unknown 10215 1727204055.42861: variable 'ansible_timeout' from source: unknown 10215 1727204055.42864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.42991: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.42996: variable 'omit' from source: magic vars 10215 1727204055.43002: starting attempt loop 10215 1727204055.43005: running the handler 10215 1727204055.43019: handler run complete 10215 1727204055.43029: attempt loop complete, returning result 10215 1727204055.43032: _execute() done 10215 1727204055.43036: dumping result to json 10215 1727204055.43040: done dumping result, returning 10215 1727204055.43049: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-3c74-8f8e-0000000003fc] 10215 1727204055.43056: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fc 10215 1727204055.43145: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fc 10215 1727204055.43148: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10215 1727204055.43213: no more pending results, returning what we have 10215 1727204055.43216: results queue empty 10215 1727204055.43217: checking for any_errors_fatal 10215 1727204055.43227: done checking for any_errors_fatal 10215 1727204055.43230: checking for max_fail_percentage 10215 1727204055.43232: done checking for max_fail_percentage 10215 1727204055.43233: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.43234: done checking to see if all hosts have failed 10215 1727204055.43235: getting the remaining hosts for this loop 10215 1727204055.43237: done getting the remaining hosts for this loop 10215 1727204055.43241: getting the next task for host managed-node3 10215 1727204055.43250: done getting next task for host managed-node3 10215 1727204055.43252: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204055.43257: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.43260: getting variables 10215 1727204055.43262: in VariableManager get_vars() 10215 1727204055.43311: Calling all_inventory to load vars for managed-node3 10215 1727204055.43314: Calling groups_inventory to load vars for managed-node3 10215 1727204055.43317: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.43327: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.43330: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.43333: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.44653: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.46208: done with get_vars() 10215 1727204055.46234: done getting variables 10215 1727204055.46283: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.46383: variable 'profile' from source: include params 10215 1727204055.46386: variable 'item' from source: include params 10215 1727204055.46438: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.052) 0:00:24.031 ***** 10215 1727204055.46470: entering _queue_task() for managed-node3/command 10215 1727204055.46735: worker is 1 (out of 1 available) 10215 1727204055.46751: exiting _queue_task() for managed-node3/command 10215 1727204055.46764: done queuing things up, now waiting for results queue to drain 10215 1727204055.46766: waiting for pending results... 
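The ok: result printed above comes from the task "Set NM profile exist flag and ansible_managed flag true based on the nmcli output". From the conditionals the executor reports (ansible_distribution_major_version != '6' and nm_profile_exists.rc == 0) and the facts in the result, the task plausibly looks like the sketch below; the nmcli command task that registers nm_profile_exists is not part of this excerpt, so everything beyond the fact names, values and when-conditions is an assumption.

    # Sketch reconstructed from the trace above; only the fact names/values and the
    # when-conditions are confirmed by the log. The distribution-version guard may
    # equally well live on an enclosing block rather than on the task itself.
    - name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
      set_fact:
        lsr_net_profile_exists: true
        lsr_net_profile_ansible_managed: true
        lsr_net_profile_fingerprint: true
      when:
        - ansible_distribution_major_version != '6'
        - nm_profile_exists.rc == 0   # nm_profile_exists is presumably registered by an earlier nmcli command task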
10215 1727204055.46954: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 10215 1727204055.47048: in run() - task 12b410aa-8751-3c74-8f8e-0000000003fe 10215 1727204055.47060: variable 'ansible_search_path' from source: unknown 10215 1727204055.47064: variable 'ansible_search_path' from source: unknown 10215 1727204055.47098: calling self._execute() 10215 1727204055.47179: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.47185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.47198: variable 'omit' from source: magic vars 10215 1727204055.47504: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.47516: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.47621: variable 'profile_stat' from source: set_fact 10215 1727204055.47633: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204055.47637: when evaluation is False, skipping this task 10215 1727204055.47640: _execute() done 10215 1727204055.47645: dumping result to json 10215 1727204055.47648: done dumping result, returning 10215 1727204055.47658: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-3c74-8f8e-0000000003fe] 10215 1727204055.47667: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fe 10215 1727204055.47754: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003fe 10215 1727204055.47759: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204055.47821: no more pending results, returning what we have 10215 1727204055.47826: results queue empty 10215 1727204055.47827: checking for any_errors_fatal 10215 1727204055.47835: done checking for any_errors_fatal 10215 1727204055.47836: checking for max_fail_percentage 10215 1727204055.47838: done checking for max_fail_percentage 10215 1727204055.47839: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.47840: done checking to see if all hosts have failed 10215 1727204055.47841: getting the remaining hosts for this loop 10215 1727204055.47843: done getting the remaining hosts for this loop 10215 1727204055.47847: getting the next task for host managed-node3 10215 1727204055.47854: done getting next task for host managed-node3 10215 1727204055.47857: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204055.47861: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.47865: getting variables 10215 1727204055.47866: in VariableManager get_vars() 10215 1727204055.47906: Calling all_inventory to load vars for managed-node3 10215 1727204055.47911: Calling groups_inventory to load vars for managed-node3 10215 1727204055.47914: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.47925: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.47928: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.47931: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.49122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.50866: done with get_vars() 10215 1727204055.50894: done getting variables 10215 1727204055.50945: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.51038: variable 'profile' from source: include params 10215 1727204055.51041: variable 'item' from source: include params 10215 1727204055.51087: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.046) 0:00:24.078 ***** 10215 1727204055.51117: entering _queue_task() for managed-node3/set_fact 10215 1727204055.51377: worker is 1 (out of 1 available) 10215 1727204055.51395: exiting _queue_task() for managed-node3/set_fact 10215 1727204055.51408: done queuing things up, now waiting for results queue to drain 10215 1727204055.51410: waiting for pending results... 
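The command task that was just skipped ("Get the ansible_managed comment in ifcfg-bond0.0", get_profile_stat.yml:49) is guarded by profile_stat.stat.exists, which evaluated to False on this host. Only the task name and that guard are visible in the log; the command body, file path and register name in the sketch below are assumptions about how such a check is typically written.

    - name: Get the ansible_managed comment in ifcfg-{{ profile }}
      command: grep 'ansible_managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed command and path
      register: lsr_net_profile_ansible_managed_cmd   # hypothetical register name
      changed_when: false                             # typical for a read-only check (assumption)
      when: profile_stat.stat.exists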
10215 1727204055.51616: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 10215 1727204055.51712: in run() - task 12b410aa-8751-3c74-8f8e-0000000003ff 10215 1727204055.51730: variable 'ansible_search_path' from source: unknown 10215 1727204055.51735: variable 'ansible_search_path' from source: unknown 10215 1727204055.51767: calling self._execute() 10215 1727204055.51850: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.51853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.51868: variable 'omit' from source: magic vars 10215 1727204055.52185: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.52198: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.52303: variable 'profile_stat' from source: set_fact 10215 1727204055.52318: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204055.52322: when evaluation is False, skipping this task 10215 1727204055.52325: _execute() done 10215 1727204055.52330: dumping result to json 10215 1727204055.52333: done dumping result, returning 10215 1727204055.52342: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-3c74-8f8e-0000000003ff] 10215 1727204055.52348: sending task result for task 12b410aa-8751-3c74-8f8e-0000000003ff 10215 1727204055.52443: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000003ff 10215 1727204055.52446: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204055.52506: no more pending results, returning what we have 10215 1727204055.52511: results queue empty 10215 1727204055.52512: checking for any_errors_fatal 10215 1727204055.52520: done checking for any_errors_fatal 10215 1727204055.52521: checking for max_fail_percentage 10215 1727204055.52523: done checking for max_fail_percentage 10215 1727204055.52524: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.52525: done checking to see if all hosts have failed 10215 1727204055.52526: getting the remaining hosts for this loop 10215 1727204055.52528: done getting the remaining hosts for this loop 10215 1727204055.52532: getting the next task for host managed-node3 10215 1727204055.52540: done getting next task for host managed-node3 10215 1727204055.52542: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10215 1727204055.52546: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.52551: getting variables 10215 1727204055.52553: in VariableManager get_vars() 10215 1727204055.52602: Calling all_inventory to load vars for managed-node3 10215 1727204055.52605: Calling groups_inventory to load vars for managed-node3 10215 1727204055.52608: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.52621: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.52624: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.52627: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.54953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.58013: done with get_vars() 10215 1727204055.58059: done getting variables 10215 1727204055.58144: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.58282: variable 'profile' from source: include params 10215 1727204055.58287: variable 'item' from source: include params 10215 1727204055.58371: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.072) 0:00:24.151 ***** 10215 1727204055.58405: entering _queue_task() for managed-node3/command 10215 1727204055.58684: worker is 1 (out of 1 available) 10215 1727204055.58701: exiting _queue_task() for managed-node3/command 10215 1727204055.58714: done queuing things up, now waiting for results queue to drain 10215 1727204055.58716: waiting for pending results... 
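Its companion "Verify the ansible_managed comment in ifcfg-bond0.0" (get_profile_stat.yml:56) skipped for the same reason. Since the later assert keys on lsr_net_profile_ansible_managed, the verify step plausibly flips that fact when the grep above succeeds; the exact expression is an assumption. The fingerprint pair at :62 and :69, traced next, appears to follow the same get/verify pattern for lsr_net_profile_fingerprint.

    - name: Verify the ansible_managed comment in ifcfg-{{ profile }}
      set_fact:
        lsr_net_profile_ansible_managed: true   # assumption: set only when the comment was actually found
      when:
        - profile_stat.stat.exists
        - lsr_net_profile_ansible_managed_cmd.rc == 0   # hypothetical register from the previous sketch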
10215 1727204055.58920: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 10215 1727204055.59023: in run() - task 12b410aa-8751-3c74-8f8e-000000000400 10215 1727204055.59036: variable 'ansible_search_path' from source: unknown 10215 1727204055.59040: variable 'ansible_search_path' from source: unknown 10215 1727204055.59074: calling self._execute() 10215 1727204055.59159: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.59162: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.59175: variable 'omit' from source: magic vars 10215 1727204055.59995: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.59999: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.60003: variable 'profile_stat' from source: set_fact 10215 1727204055.60006: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204055.60012: when evaluation is False, skipping this task 10215 1727204055.60014: _execute() done 10215 1727204055.60017: dumping result to json 10215 1727204055.60019: done dumping result, returning 10215 1727204055.60022: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-3c74-8f8e-000000000400] 10215 1727204055.60035: sending task result for task 12b410aa-8751-3c74-8f8e-000000000400 10215 1727204055.60159: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000400 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204055.60230: no more pending results, returning what we have 10215 1727204055.60234: results queue empty 10215 1727204055.60235: checking for any_errors_fatal 10215 1727204055.60244: done checking for any_errors_fatal 10215 1727204055.60245: checking for max_fail_percentage 10215 1727204055.60246: done checking for max_fail_percentage 10215 1727204055.60247: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.60248: done checking to see if all hosts have failed 10215 1727204055.60249: getting the remaining hosts for this loop 10215 1727204055.60251: done getting the remaining hosts for this loop 10215 1727204055.60255: getting the next task for host managed-node3 10215 1727204055.60263: done getting next task for host managed-node3 10215 1727204055.60265: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10215 1727204055.60273: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.60279: getting variables 10215 1727204055.60281: in VariableManager get_vars() 10215 1727204055.60327: Calling all_inventory to load vars for managed-node3 10215 1727204055.60331: Calling groups_inventory to load vars for managed-node3 10215 1727204055.60334: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.60496: WORKER PROCESS EXITING 10215 1727204055.60520: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.60525: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.60529: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.63369: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.66641: done with get_vars() 10215 1727204055.66705: done getting variables 10215 1727204055.66799: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.66940: variable 'profile' from source: include params 10215 1727204055.66945: variable 'item' from source: include params 10215 1727204055.67027: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.086) 0:00:24.237 ***** 10215 1727204055.67064: entering _queue_task() for managed-node3/set_fact 10215 1727204055.67527: worker is 1 (out of 1 available) 10215 1727204055.67706: exiting _queue_task() for managed-node3/set_fact 10215 1727204055.67719: done queuing things up, now waiting for results queue to drain 10215 1727204055.67721: waiting for pending results... 
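All four of these ifcfg comment checks skip because profile_stat.stat.exists is False. profile_stat presumably comes from the "Stat profile file" task earlier in get_profile_stat.yml (its next run is traced at the end of this excerpt, get_profile_stat.yml:9); a plausible shape, with the path and register name as assumptions, is:

    - name: Stat profile file
      stat:
        path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed location of an initscripts-style profile
      register: profile_stat   # implied by the profile_stat.stat.exists checks above

On a host whose bond0.0 connection exists only as a NetworkManager profile, no ifcfg file is present, so stat.exists is false and the comment checks skip, which is consistent with the skips above.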
10215 1727204055.67879: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 10215 1727204055.68034: in run() - task 12b410aa-8751-3c74-8f8e-000000000401 10215 1727204055.68067: variable 'ansible_search_path' from source: unknown 10215 1727204055.68076: variable 'ansible_search_path' from source: unknown 10215 1727204055.68124: calling self._execute() 10215 1727204055.68280: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.68284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.68286: variable 'omit' from source: magic vars 10215 1727204055.68740: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.68759: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.68934: variable 'profile_stat' from source: set_fact 10215 1727204055.68962: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204055.68998: when evaluation is False, skipping this task 10215 1727204055.69002: _execute() done 10215 1727204055.69005: dumping result to json 10215 1727204055.69007: done dumping result, returning 10215 1727204055.69010: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-3c74-8f8e-000000000401] 10215 1727204055.69019: sending task result for task 12b410aa-8751-3c74-8f8e-000000000401 10215 1727204055.69227: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000401 10215 1727204055.69230: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204055.69294: no more pending results, returning what we have 10215 1727204055.69299: results queue empty 10215 1727204055.69301: checking for any_errors_fatal 10215 1727204055.69308: done checking for any_errors_fatal 10215 1727204055.69309: checking for max_fail_percentage 10215 1727204055.69311: done checking for max_fail_percentage 10215 1727204055.69312: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.69313: done checking to see if all hosts have failed 10215 1727204055.69314: getting the remaining hosts for this loop 10215 1727204055.69316: done getting the remaining hosts for this loop 10215 1727204055.69321: getting the next task for host managed-node3 10215 1727204055.69331: done getting next task for host managed-node3 10215 1727204055.69334: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10215 1727204055.69339: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.69345: getting variables 10215 1727204055.69347: in VariableManager get_vars() 10215 1727204055.69507: Calling all_inventory to load vars for managed-node3 10215 1727204055.69512: Calling groups_inventory to load vars for managed-node3 10215 1727204055.69515: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.69531: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.69535: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.69540: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.72056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.74963: done with get_vars() 10215 1727204055.75015: done getting variables 10215 1727204055.75106: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.75251: variable 'profile' from source: include params 10215 1727204055.75256: variable 'item' from source: include params 10215 1727204055.75345: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.083) 0:00:24.320 ***** 10215 1727204055.75392: entering _queue_task() for managed-node3/assert 10215 1727204055.75786: worker is 1 (out of 1 available) 10215 1727204055.75907: exiting _queue_task() for managed-node3/assert 10215 1727204055.75920: done queuing things up, now waiting for results queue to drain 10215 1727204055.75922: waiting for pending results... 
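The TASK header just above is the first of three asserts in assert_profile_present.yml (lines 5, 10 and 15 per the task paths in this log). The conditionals the executor evaluates for them further down are lsr_net_profile_exists, lsr_net_profile_ansible_managed and lsr_net_profile_fingerprint, so the file plausibly contains something close to the sketch below (only the names and asserted variables are confirmed; any fail_msg text is omitted as unknown). The quoting of '{{ profile }}' in the first two names versus the bare {{ profile }} in the third matches the task names the log prints.

    - name: "Assert that the profile is present - '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_exists

    - name: "Assert that the ansible managed comment is present in '{{ profile }}'"
      assert:
        that:
          - lsr_net_profile_ansible_managed

    - name: "Assert that the fingerprint comment is present in {{ profile }}"
      assert:
        that:
          - lsr_net_profile_fingerprint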
10215 1727204055.76241: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' 10215 1727204055.76307: in run() - task 12b410aa-8751-3c74-8f8e-000000000267 10215 1727204055.76312: variable 'ansible_search_path' from source: unknown 10215 1727204055.76335: variable 'ansible_search_path' from source: unknown 10215 1727204055.76371: calling self._execute() 10215 1727204055.76524: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.76527: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.76530: variable 'omit' from source: magic vars 10215 1727204055.76999: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.77019: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.77031: variable 'omit' from source: magic vars 10215 1727204055.77174: variable 'omit' from source: magic vars 10215 1727204055.77236: variable 'profile' from source: include params 10215 1727204055.77246: variable 'item' from source: include params 10215 1727204055.77336: variable 'item' from source: include params 10215 1727204055.77362: variable 'omit' from source: magic vars 10215 1727204055.77420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204055.77472: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204055.77508: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204055.77542: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.77617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.77620: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204055.77622: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.77625: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.77758: Set connection var ansible_connection to ssh 10215 1727204055.77772: Set connection var ansible_pipelining to False 10215 1727204055.77782: Set connection var ansible_shell_type to sh 10215 1727204055.77796: Set connection var ansible_timeout to 10 10215 1727204055.77807: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204055.77822: Set connection var ansible_shell_executable to /bin/sh 10215 1727204055.77854: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.77869: variable 'ansible_connection' from source: unknown 10215 1727204055.77877: variable 'ansible_module_compression' from source: unknown 10215 1727204055.77969: variable 'ansible_shell_type' from source: unknown 10215 1727204055.77972: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.77975: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.77977: variable 'ansible_pipelining' from source: unknown 10215 1727204055.77980: variable 'ansible_timeout' from source: unknown 10215 1727204055.77982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.78096: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.78119: variable 'omit' from source: magic vars 10215 1727204055.78129: starting attempt loop 10215 1727204055.78137: running the handler 10215 1727204055.78283: variable 'lsr_net_profile_exists' from source: set_fact 10215 1727204055.78315: Evaluated conditional (lsr_net_profile_exists): True 10215 1727204055.78321: handler run complete 10215 1727204055.78395: attempt loop complete, returning result 10215 1727204055.78398: _execute() done 10215 1727204055.78401: dumping result to json 10215 1727204055.78403: done dumping result, returning 10215 1727204055.78405: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.0' [12b410aa-8751-3c74-8f8e-000000000267] 10215 1727204055.78408: sending task result for task 12b410aa-8751-3c74-8f8e-000000000267 ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204055.78746: no more pending results, returning what we have 10215 1727204055.78750: results queue empty 10215 1727204055.78752: checking for any_errors_fatal 10215 1727204055.78758: done checking for any_errors_fatal 10215 1727204055.78759: checking for max_fail_percentage 10215 1727204055.78761: done checking for max_fail_percentage 10215 1727204055.78762: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.78763: done checking to see if all hosts have failed 10215 1727204055.78764: getting the remaining hosts for this loop 10215 1727204055.78766: done getting the remaining hosts for this loop 10215 1727204055.78771: getting the next task for host managed-node3 10215 1727204055.78778: done getting next task for host managed-node3 10215 1727204055.78781: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10215 1727204055.78784: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.78790: getting variables 10215 1727204055.78792: in VariableManager get_vars() 10215 1727204055.78842: Calling all_inventory to load vars for managed-node3 10215 1727204055.78845: Calling groups_inventory to load vars for managed-node3 10215 1727204055.78849: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.78863: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.78867: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.78871: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.79406: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000267 10215 1727204055.79410: WORKER PROCESS EXITING 10215 1727204055.81435: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.84483: done with get_vars() 10215 1727204055.84537: done getting variables 10215 1727204055.84613: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.84761: variable 'profile' from source: include params 10215 1727204055.84766: variable 'item' from source: include params 10215 1727204055.84842: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.094) 0:00:24.416 ***** 10215 1727204055.84892: entering _queue_task() for managed-node3/assert 10215 1727204055.85270: worker is 1 (out of 1 available) 10215 1727204055.85286: exiting _queue_task() for managed-node3/assert 10215 1727204055.85415: done queuing things up, now waiting for results queue to drain 10215 1727204055.85417: waiting for pending results... 
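For every task in this stretch the executor sets the same six connection variables: ansible_connection=ssh, ansible_pipelining=False, ansible_shell_type=sh, ansible_timeout=10, ansible_module_compression=ZIP_DEFLATED, ansible_shell_executable=/bin/sh. Here they appear to come from defaults plus the ansible_host/ansible_ssh_extra_args host vars; the same values could be pinned explicitly as host variables, e.g. in a hypothetical host_vars/managed-node3.yml:

    # Hypothetical host_vars sketch; the variable names are the ones the executor reports,
    # but pinning them per host is an illustration, not something this run actually does.
    ansible_connection: ssh
    ansible_pipelining: false
    ansible_shell_type: sh
    ansible_timeout: 10
    ansible_module_compression: ZIP_DEFLATED
    ansible_shell_executable: /bin/sh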
10215 1727204055.85619: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' 10215 1727204055.85766: in run() - task 12b410aa-8751-3c74-8f8e-000000000268 10215 1727204055.85791: variable 'ansible_search_path' from source: unknown 10215 1727204055.85800: variable 'ansible_search_path' from source: unknown 10215 1727204055.85853: calling self._execute() 10215 1727204055.85977: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.85993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.86011: variable 'omit' from source: magic vars 10215 1727204055.86475: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.86500: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.86517: variable 'omit' from source: magic vars 10215 1727204055.86577: variable 'omit' from source: magic vars 10215 1727204055.86737: variable 'profile' from source: include params 10215 1727204055.86749: variable 'item' from source: include params 10215 1727204055.86841: variable 'item' from source: include params 10215 1727204055.86867: variable 'omit' from source: magic vars 10215 1727204055.86935: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204055.87045: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204055.87050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204055.87053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.87062: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.87109: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204055.87119: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.87128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.87268: Set connection var ansible_connection to ssh 10215 1727204055.87293: Set connection var ansible_pipelining to False 10215 1727204055.87307: Set connection var ansible_shell_type to sh 10215 1727204055.87320: Set connection var ansible_timeout to 10 10215 1727204055.87372: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204055.87376: Set connection var ansible_shell_executable to /bin/sh 10215 1727204055.87378: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.87389: variable 'ansible_connection' from source: unknown 10215 1727204055.87398: variable 'ansible_module_compression' from source: unknown 10215 1727204055.87404: variable 'ansible_shell_type' from source: unknown 10215 1727204055.87410: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.87416: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.87423: variable 'ansible_pipelining' from source: unknown 10215 1727204055.87429: variable 'ansible_timeout' from source: unknown 10215 1727204055.87480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.87614: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.87635: variable 'omit' from source: magic vars 10215 1727204055.87646: starting attempt loop 10215 1727204055.87654: running the handler 10215 1727204055.87825: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10215 1727204055.87838: Evaluated conditional (lsr_net_profile_ansible_managed): True 10215 1727204055.87851: handler run complete 10215 1727204055.87874: attempt loop complete, returning result 10215 1727204055.87917: _execute() done 10215 1727204055.87921: dumping result to json 10215 1727204055.87925: done dumping result, returning 10215 1727204055.87927: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12b410aa-8751-3c74-8f8e-000000000268] 10215 1727204055.87930: sending task result for task 12b410aa-8751-3c74-8f8e-000000000268 10215 1727204055.88098: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000268 10215 1727204055.88102: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204055.88170: no more pending results, returning what we have 10215 1727204055.88174: results queue empty 10215 1727204055.88175: checking for any_errors_fatal 10215 1727204055.88184: done checking for any_errors_fatal 10215 1727204055.88185: checking for max_fail_percentage 10215 1727204055.88188: done checking for max_fail_percentage 10215 1727204055.88191: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.88192: done checking to see if all hosts have failed 10215 1727204055.88193: getting the remaining hosts for this loop 10215 1727204055.88195: done getting the remaining hosts for this loop 10215 1727204055.88201: getting the next task for host managed-node3 10215 1727204055.88209: done getting next task for host managed-node3 10215 1727204055.88212: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10215 1727204055.88216: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.88221: getting variables 10215 1727204055.88223: in VariableManager get_vars() 10215 1727204055.88278: Calling all_inventory to load vars for managed-node3 10215 1727204055.88283: Calling groups_inventory to load vars for managed-node3 10215 1727204055.88286: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.88517: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.88522: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.88526: Calling groups_plugins_play to load vars for managed-node3 10215 1727204055.90973: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204055.94033: done with get_vars() 10215 1727204055.94085: done getting variables 10215 1727204055.94166: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204055.94306: variable 'profile' from source: include params 10215 1727204055.94310: variable 'item' from source: include params 10215 1727204055.94388: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:15 -0400 (0:00:00.095) 0:00:24.511 ***** 10215 1727204055.94433: entering _queue_task() for managed-node3/assert 10215 1727204055.95004: worker is 1 (out of 1 available) 10215 1727204055.95015: exiting _queue_task() for managed-node3/assert 10215 1727204055.95028: done queuing things up, now waiting for results queue to drain 10215 1727204055.95029: waiting for pending results... 
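Throughout this block the executor resolves both 'profile' and 'item' "from source: include params", i.e. assert_profile_present.yml is included with the profile handed in per loop item (bond0.0 in this stretch). A hypothetical caller shape is sketched below; only the profile/item wiring is visible in the log, while the task name, file path and loop list are assumptions.

    - name: Assert that the tested profiles are present   # hypothetical task name
      include_tasks: tasks/assert_profile_present.yml     # path relative to the test playbook (assumed)
      vars:
        profile: "{{ item }}"
      loop:
        - bond0.0   # the only item that appears in this excerpt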
10215 1727204055.95274: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 10215 1727204055.95320: in run() - task 12b410aa-8751-3c74-8f8e-000000000269 10215 1727204055.95343: variable 'ansible_search_path' from source: unknown 10215 1727204055.95352: variable 'ansible_search_path' from source: unknown 10215 1727204055.95478: calling self._execute() 10215 1727204055.95528: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.95543: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.95560: variable 'omit' from source: magic vars 10215 1727204055.96034: variable 'ansible_distribution_major_version' from source: facts 10215 1727204055.96054: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204055.96069: variable 'omit' from source: magic vars 10215 1727204055.96137: variable 'omit' from source: magic vars 10215 1727204055.96266: variable 'profile' from source: include params 10215 1727204055.96275: variable 'item' from source: include params 10215 1727204055.96367: variable 'item' from source: include params 10215 1727204055.96528: variable 'omit' from source: magic vars 10215 1727204055.96633: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204055.96693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204055.96731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204055.96761: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.96839: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204055.96843: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204055.96845: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.96850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.96999: Set connection var ansible_connection to ssh 10215 1727204055.97015: Set connection var ansible_pipelining to False 10215 1727204055.97028: Set connection var ansible_shell_type to sh 10215 1727204055.97042: Set connection var ansible_timeout to 10 10215 1727204055.97060: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204055.97076: Set connection var ansible_shell_executable to /bin/sh 10215 1727204055.97118: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.97166: variable 'ansible_connection' from source: unknown 10215 1727204055.97169: variable 'ansible_module_compression' from source: unknown 10215 1727204055.97171: variable 'ansible_shell_type' from source: unknown 10215 1727204055.97174: variable 'ansible_shell_executable' from source: unknown 10215 1727204055.97176: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204055.97178: variable 'ansible_pipelining' from source: unknown 10215 1727204055.97181: variable 'ansible_timeout' from source: unknown 10215 1727204055.97183: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204055.97369: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204055.97400: variable 'omit' from source: magic vars 10215 1727204055.97434: starting attempt loop 10215 1727204055.97437: running the handler 10215 1727204055.97594: variable 'lsr_net_profile_fingerprint' from source: set_fact 10215 1727204055.97653: Evaluated conditional (lsr_net_profile_fingerprint): True 10215 1727204055.97656: handler run complete 10215 1727204055.97658: attempt loop complete, returning result 10215 1727204055.97660: _execute() done 10215 1727204055.97663: dumping result to json 10215 1727204055.97665: done dumping result, returning 10215 1727204055.97678: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.0 [12b410aa-8751-3c74-8f8e-000000000269] 10215 1727204055.97693: sending task result for task 12b410aa-8751-3c74-8f8e-000000000269 10215 1727204055.97945: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000269 10215 1727204055.97949: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204055.98018: no more pending results, returning what we have 10215 1727204055.98022: results queue empty 10215 1727204055.98023: checking for any_errors_fatal 10215 1727204055.98031: done checking for any_errors_fatal 10215 1727204055.98031: checking for max_fail_percentage 10215 1727204055.98033: done checking for max_fail_percentage 10215 1727204055.98034: checking to see if all hosts have failed and the running result is not ok 10215 1727204055.98035: done checking to see if all hosts have failed 10215 1727204055.98036: getting the remaining hosts for this loop 10215 1727204055.98038: done getting the remaining hosts for this loop 10215 1727204055.98042: getting the next task for host managed-node3 10215 1727204055.98052: done getting next task for host managed-node3 10215 1727204055.98055: ^ task is: TASK: Include the task 'get_profile_stat.yml' 10215 1727204055.98279: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204055.98285: getting variables 10215 1727204055.98287: in VariableManager get_vars() 10215 1727204055.98333: Calling all_inventory to load vars for managed-node3 10215 1727204055.98337: Calling groups_inventory to load vars for managed-node3 10215 1727204055.98341: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204055.98354: Calling all_plugins_play to load vars for managed-node3 10215 1727204055.98357: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204055.98361: Calling groups_plugins_play to load vars for managed-node3 10215 1727204056.02228: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204056.05650: done with get_vars() 10215 1727204056.05803: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.116) 0:00:24.627 ***** 10215 1727204056.06040: entering _queue_task() for managed-node3/include_tasks 10215 1727204056.06857: worker is 1 (out of 1 available) 10215 1727204056.06873: exiting _queue_task() for managed-node3/include_tasks 10215 1727204056.06887: done queuing things up, now waiting for results queue to drain 10215 1727204056.06892: waiting for pending results... 10215 1727204056.07693: running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' 10215 1727204056.07808: in run() - task 12b410aa-8751-3c74-8f8e-00000000026d 10215 1727204056.08000: variable 'ansible_search_path' from source: unknown 10215 1727204056.08004: variable 'ansible_search_path' from source: unknown 10215 1727204056.08007: calling self._execute() 10215 1727204056.08171: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.08229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.08248: variable 'omit' from source: magic vars 10215 1727204056.09267: variable 'ansible_distribution_major_version' from source: facts 10215 1727204056.09287: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204056.09415: _execute() done 10215 1727204056.09419: dumping result to json 10215 1727204056.09422: done dumping result, returning 10215 1727204056.09424: done running TaskExecutor() for managed-node3/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-3c74-8f8e-00000000026d] 10215 1727204056.09426: sending task result for task 12b410aa-8751-3c74-8f8e-00000000026d 10215 1727204056.09596: no more pending results, returning what we have 10215 1727204056.09602: in VariableManager get_vars() 10215 1727204056.09660: Calling all_inventory to load vars for managed-node3 10215 1727204056.09664: Calling groups_inventory to load vars for managed-node3 10215 1727204056.09667: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204056.09684: Calling all_plugins_play to load vars for managed-node3 10215 1727204056.09690: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204056.09695: Calling groups_plugins_play to load vars for managed-node3 10215 1727204056.10640: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000026d 10215 1727204056.10644: WORKER PROCESS EXITING 10215 1727204056.25587: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 10215 1727204056.28616: done with get_vars() 10215 1727204056.28661: variable 'ansible_search_path' from source: unknown 10215 1727204056.28668: variable 'ansible_search_path' from source: unknown 10215 1727204056.28717: we have included files to process 10215 1727204056.28718: generating all_blocks data 10215 1727204056.28720: done generating all_blocks data 10215 1727204056.28724: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204056.28726: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204056.28728: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 10215 1727204056.30105: done processing included file 10215 1727204056.30108: iterating over new_blocks loaded from include file 10215 1727204056.30110: in VariableManager get_vars() 10215 1727204056.30139: done with get_vars() 10215 1727204056.30142: filtering new block on tags 10215 1727204056.30176: done filtering new block on tags 10215 1727204056.30180: in VariableManager get_vars() 10215 1727204056.30210: done with get_vars() 10215 1727204056.30213: filtering new block on tags 10215 1727204056.30244: done filtering new block on tags 10215 1727204056.30247: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node3 10215 1727204056.30253: extending task lists for all hosts with included blocks 10215 1727204056.30838: done extending task lists 10215 1727204056.30840: done processing included files 10215 1727204056.30841: results queue empty 10215 1727204056.30841: checking for any_errors_fatal 10215 1727204056.30846: done checking for any_errors_fatal 10215 1727204056.30847: checking for max_fail_percentage 10215 1727204056.30848: done checking for max_fail_percentage 10215 1727204056.30849: checking to see if all hosts have failed and the running result is not ok 10215 1727204056.30850: done checking to see if all hosts have failed 10215 1727204056.30851: getting the remaining hosts for this loop 10215 1727204056.30853: done getting the remaining hosts for this loop 10215 1727204056.30856: getting the next task for host managed-node3 10215 1727204056.30861: done getting next task for host managed-node3 10215 1727204056.30863: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204056.30867: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204056.30871: getting variables 10215 1727204056.30872: in VariableManager get_vars() 10215 1727204056.30892: Calling all_inventory to load vars for managed-node3 10215 1727204056.30895: Calling groups_inventory to load vars for managed-node3 10215 1727204056.30898: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204056.30905: Calling all_plugins_play to load vars for managed-node3 10215 1727204056.30908: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204056.30912: Calling groups_plugins_play to load vars for managed-node3 10215 1727204056.33216: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204056.37142: done with get_vars() 10215 1727204056.37194: done getting variables 10215 1727204056.37257: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.312) 0:00:24.940 ***** 10215 1727204056.37303: entering _queue_task() for managed-node3/set_fact 10215 1727204056.37910: worker is 1 (out of 1 available) 10215 1727204056.37925: exiting _queue_task() for managed-node3/set_fact 10215 1727204056.37944: done queuing things up, now waiting for results queue to drain 10215 1727204056.37946: waiting for pending results... 
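The include that was just processed pulls get_profile_stat.yml back in for the next profile under test, and its first task, traced below, re-initializes the three flags; the fact names and false values come straight from the result the log prints for it. A sketch (argument shapes beyond that are assumed):

    # assert_profile_present.yml:3, as named in the header above
    - name: Include the task 'get_profile_stat.yml'
      include_tasks: get_profile_stat.yml

    # get_profile_stat.yml:3 - fact names and values as printed in the task result below
    - name: Initialize NM profile exist and ansible_managed comment flag
      set_fact:
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false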
10215 1727204056.38165: running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag 10215 1727204056.38333: in run() - task 12b410aa-8751-3c74-8f8e-000000000440 10215 1727204056.38355: variable 'ansible_search_path' from source: unknown 10215 1727204056.38363: variable 'ansible_search_path' from source: unknown 10215 1727204056.38417: calling self._execute() 10215 1727204056.38538: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.38552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.38570: variable 'omit' from source: magic vars 10215 1727204056.39054: variable 'ansible_distribution_major_version' from source: facts 10215 1727204056.39075: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204056.39088: variable 'omit' from source: magic vars 10215 1727204056.39179: variable 'omit' from source: magic vars 10215 1727204056.39236: variable 'omit' from source: magic vars 10215 1727204056.39296: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204056.39346: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204056.39382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204056.39416: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204056.39435: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204056.39795: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204056.39801: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.39804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.39854: Set connection var ansible_connection to ssh 10215 1727204056.39909: Set connection var ansible_pipelining to False 10215 1727204056.39929: Set connection var ansible_shell_type to sh 10215 1727204056.40142: Set connection var ansible_timeout to 10 10215 1727204056.40145: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204056.40148: Set connection var ansible_shell_executable to /bin/sh 10215 1727204056.40150: variable 'ansible_shell_executable' from source: unknown 10215 1727204056.40153: variable 'ansible_connection' from source: unknown 10215 1727204056.40155: variable 'ansible_module_compression' from source: unknown 10215 1727204056.40157: variable 'ansible_shell_type' from source: unknown 10215 1727204056.40159: variable 'ansible_shell_executable' from source: unknown 10215 1727204056.40161: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.40163: variable 'ansible_pipelining' from source: unknown 10215 1727204056.40165: variable 'ansible_timeout' from source: unknown 10215 1727204056.40167: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.40529: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204056.40551: variable 
'omit' from source: magic vars 10215 1727204056.40591: starting attempt loop 10215 1727204056.40698: running the handler 10215 1727204056.40796: handler run complete 10215 1727204056.40799: attempt loop complete, returning result 10215 1727204056.40803: _execute() done 10215 1727204056.40806: dumping result to json 10215 1727204056.40810: done dumping result, returning 10215 1727204056.40813: done running TaskExecutor() for managed-node3/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-3c74-8f8e-000000000440] 10215 1727204056.40816: sending task result for task 12b410aa-8751-3c74-8f8e-000000000440 10215 1727204056.40885: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000440 ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 10215 1727204056.40968: no more pending results, returning what we have 10215 1727204056.40972: results queue empty 10215 1727204056.40973: checking for any_errors_fatal 10215 1727204056.40975: done checking for any_errors_fatal 10215 1727204056.40977: checking for max_fail_percentage 10215 1727204056.40978: done checking for max_fail_percentage 10215 1727204056.40980: checking to see if all hosts have failed and the running result is not ok 10215 1727204056.40981: done checking to see if all hosts have failed 10215 1727204056.40982: getting the remaining hosts for this loop 10215 1727204056.40984: done getting the remaining hosts for this loop 10215 1727204056.40992: getting the next task for host managed-node3 10215 1727204056.41058: done getting next task for host managed-node3 10215 1727204056.41062: ^ task is: TASK: Stat profile file 10215 1727204056.41067: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204056.41073: getting variables 10215 1727204056.41075: in VariableManager get_vars() 10215 1727204056.41228: Calling all_inventory to load vars for managed-node3 10215 1727204056.41232: Calling groups_inventory to load vars for managed-node3 10215 1727204056.41235: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204056.41251: Calling all_plugins_play to load vars for managed-node3 10215 1727204056.41254: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204056.41258: Calling groups_plugins_play to load vars for managed-node3 10215 1727204056.41832: WORKER PROCESS EXITING 10215 1727204056.43802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204056.49111: done with get_vars() 10215 1727204056.49163: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.122) 0:00:25.062 ***** 10215 1727204056.49511: entering _queue_task() for managed-node3/stat 10215 1727204056.50227: worker is 1 (out of 1 available) 10215 1727204056.50238: exiting _queue_task() for managed-node3/stat 10215 1727204056.50251: done queuing things up, now waiting for results queue to drain 10215 1727204056.50253: waiting for pending results... 10215 1727204056.50497: running TaskExecutor() for managed-node3/TASK: Stat profile file 10215 1727204056.50566: in run() - task 12b410aa-8751-3c74-8f8e-000000000441 10215 1727204056.50598: variable 'ansible_search_path' from source: unknown 10215 1727204056.50612: variable 'ansible_search_path' from source: unknown 10215 1727204056.50660: calling self._execute() 10215 1727204056.50782: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.50797: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.50826: variable 'omit' from source: magic vars 10215 1727204056.51299: variable 'ansible_distribution_major_version' from source: facts 10215 1727204056.51322: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204056.51335: variable 'omit' from source: magic vars 10215 1727204056.51410: variable 'omit' from source: magic vars 10215 1727204056.51573: variable 'profile' from source: include params 10215 1727204056.51577: variable 'item' from source: include params 10215 1727204056.51636: variable 'item' from source: include params 10215 1727204056.51678: variable 'omit' from source: magic vars 10215 1727204056.51735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204056.51781: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204056.51901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204056.51904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204056.51909: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204056.51912: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204056.51915: variable 'ansible_host' from source: host vars for 
'managed-node3' 10215 1727204056.51923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.52072: Set connection var ansible_connection to ssh 10215 1727204056.52090: Set connection var ansible_pipelining to False 10215 1727204056.52106: Set connection var ansible_shell_type to sh 10215 1727204056.52132: Set connection var ansible_timeout to 10 10215 1727204056.52146: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204056.52163: Set connection var ansible_shell_executable to /bin/sh 10215 1727204056.52200: variable 'ansible_shell_executable' from source: unknown 10215 1727204056.52212: variable 'ansible_connection' from source: unknown 10215 1727204056.52227: variable 'ansible_module_compression' from source: unknown 10215 1727204056.52236: variable 'ansible_shell_type' from source: unknown 10215 1727204056.52243: variable 'ansible_shell_executable' from source: unknown 10215 1727204056.52251: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.52259: variable 'ansible_pipelining' from source: unknown 10215 1727204056.52266: variable 'ansible_timeout' from source: unknown 10215 1727204056.52275: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.52529: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204056.52594: variable 'omit' from source: magic vars 10215 1727204056.52598: starting attempt loop 10215 1727204056.52600: running the handler 10215 1727204056.52603: _low_level_execute_command(): starting 10215 1727204056.52606: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204056.53357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.53380: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.53442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204056.53525: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204056.53560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204056.53598: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.53657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.55418: stdout chunk (state=3): >>>/root <<< 10215 1727204056.55596: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204056.55602: stdout 
chunk (state=3): >>><<< 10215 1727204056.55630: stderr chunk (state=3): >>><<< 10215 1727204056.55761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204056.55765: _low_level_execute_command(): starting 10215 1727204056.55768: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222 `" && echo ansible-tmp-1727204056.556553-11630-6917527572222="` echo /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222 `" ) && sleep 0' 10215 1727204056.56323: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.56346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.56404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204056.56472: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204056.56493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204056.56513: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.56657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.58672: stdout chunk (state=3): >>>ansible-tmp-1727204056.556553-11630-6917527572222=/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222 <<< 10215 1727204056.58879: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204056.58883: stdout chunk (state=3): >>><<< 10215 
1727204056.58885: stderr chunk (state=3): >>><<< 10215 1727204056.59098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204056.556553-11630-6917527572222=/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204056.59102: variable 'ansible_module_compression' from source: unknown 10215 1727204056.59105: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 10215 1727204056.59107: variable 'ansible_facts' from source: unknown 10215 1727204056.59172: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py 10215 1727204056.59406: Sending initial data 10215 1727204056.59420: Sent initial data (150 bytes) 10215 1727204056.60005: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.60021: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.60038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204056.60058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204056.60108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204056.60129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204056.60211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204056.60232: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204056.60249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204056.60278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.60356: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.62035: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204056.62079: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204056.62142: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpyu5qtcdt /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py <<< 10215 1727204056.62146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py" <<< 10215 1727204056.62193: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpyu5qtcdt" to remote "/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py" <<< 10215 1727204056.63286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204056.63352: stderr chunk (state=3): >>><<< 10215 1727204056.63362: stdout chunk (state=3): >>><<< 10215 1727204056.63408: done transferring module to remote 10215 1727204056.63491: _low_level_execute_command(): starting 10215 1727204056.63497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/ /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py && sleep 0' 10215 1727204056.64117: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.64133: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.64169: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204056.64286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 10215 1727204056.64320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.64379: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.66342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204056.66346: stdout chunk (state=3): >>><<< 10215 1727204056.66348: stderr chunk (state=3): >>><<< 10215 1727204056.66395: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204056.66399: _low_level_execute_command(): starting 10215 1727204056.66402: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/AnsiballZ_stat.py && sleep 0' 10215 1727204056.67336: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.67394: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.67431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204056.67512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204056.67541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204056.67563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.67759: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.85313: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, 
"get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 10215 1727204056.86823: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204056.86834: stdout chunk (state=3): >>><<< 10215 1727204056.86848: stderr chunk (state=3): >>><<< 10215 1727204056.86874: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204056.86996: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204056.87000: _low_level_execute_command(): starting 10215 1727204056.87002: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204056.556553-11630-6917527572222/ > /dev/null 2>&1 && sleep 0' 10215 1727204056.87585: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204056.87604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204056.87619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204056.87639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204056.87656: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204056.87670: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204056.87687: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204056.87791: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204056.87819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204056.87880: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204056.89894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204056.89898: stdout chunk (state=3): >>><<< 10215 1727204056.89908: stderr chunk (state=3): >>><<< 10215 1727204056.89928: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204056.89935: handler run complete 10215 1727204056.89967: attempt loop complete, returning result 10215 1727204056.89970: _execute() done 10215 1727204056.89973: dumping result to json 10215 1727204056.89979: done dumping result, returning 10215 1727204056.89991: done running TaskExecutor() for managed-node3/TASK: Stat profile file [12b410aa-8751-3c74-8f8e-000000000441] 10215 1727204056.89998: sending task result for task 12b410aa-8751-3c74-8f8e-000000000441 10215 1727204056.90284: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000441 10215 1727204056.90287: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "stat": { "exists": false } } 10215 1727204056.90385: no more pending results, returning what we have 10215 1727204056.90392: results queue empty 10215 1727204056.90393: checking for any_errors_fatal 10215 1727204056.90400: done checking for any_errors_fatal 10215 1727204056.90401: checking for max_fail_percentage 10215 1727204056.90403: done checking for max_fail_percentage 10215 1727204056.90404: checking to see if all hosts have failed and the running result is not ok 10215 1727204056.90405: done checking to see if all hosts have failed 10215 1727204056.90406: getting the remaining hosts for this loop 10215 1727204056.90410: done getting the remaining hosts for this loop 10215 1727204056.90415: getting the next task for host managed-node3 10215 1727204056.90423: done getting next task for host managed-node3 10215 1727204056.90425: ^ task is: TASK: Set NM profile exist flag based on the profile files 10215 1727204056.90429: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204056.90434: getting variables 10215 1727204056.90436: in VariableManager get_vars() 10215 1727204056.90484: Calling all_inventory to load vars for managed-node3 10215 1727204056.90488: Calling groups_inventory to load vars for managed-node3 10215 1727204056.90620: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204056.90635: Calling all_plugins_play to load vars for managed-node3 10215 1727204056.90639: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204056.90643: Calling groups_plugins_play to load vars for managed-node3 10215 1727204056.93287: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204056.96303: done with get_vars() 10215 1727204056.96338: done getting variables 10215 1727204056.96412: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:16 -0400 (0:00:00.469) 0:00:25.531 ***** 10215 1727204056.96448: entering _queue_task() for managed-node3/set_fact 10215 1727204056.96781: worker is 1 (out of 1 available) 10215 1727204056.96996: exiting _queue_task() for managed-node3/set_fact 10215 1727204056.97009: done queuing things up, now waiting for results queue to drain 10215 1727204056.97011: waiting for pending results... 
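As the trace that follows shows, this set_fact is skipped because its conditional profile_stat.stat.exists evaluates to False (no ifcfg file was found by the stat task above). A plausible sketch of the task at get_profile_stat.yml:17; the conditional matches the false_condition reported below, while the fact value it would set is inferred from the flag initialized earlier and is not shown verbatim in this log:

  # Sketch only: the 'when' matches the reported false_condition; the body
  # (flipping lsr_net_profile_exists to true) is an assumption.
  - name: Set NM profile exist flag based on the profile files
    set_fact:
      lsr_net_profile_exists: true
    when: profile_stat.stat.exists
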
10215 1727204056.97106: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files 10215 1727204056.97251: in run() - task 12b410aa-8751-3c74-8f8e-000000000442 10215 1727204056.97276: variable 'ansible_search_path' from source: unknown 10215 1727204056.97285: variable 'ansible_search_path' from source: unknown 10215 1727204056.97339: calling self._execute() 10215 1727204056.97464: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204056.97479: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204056.97501: variable 'omit' from source: magic vars 10215 1727204056.97971: variable 'ansible_distribution_major_version' from source: facts 10215 1727204056.97996: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204056.98158: variable 'profile_stat' from source: set_fact 10215 1727204056.98182: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204056.98216: when evaluation is False, skipping this task 10215 1727204056.98219: _execute() done 10215 1727204056.98222: dumping result to json 10215 1727204056.98225: done dumping result, returning 10215 1727204056.98326: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-3c74-8f8e-000000000442] 10215 1727204056.98330: sending task result for task 12b410aa-8751-3c74-8f8e-000000000442 10215 1727204056.98407: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000442 10215 1727204056.98411: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204056.98484: no more pending results, returning what we have 10215 1727204056.98491: results queue empty 10215 1727204056.98493: checking for any_errors_fatal 10215 1727204056.98505: done checking for any_errors_fatal 10215 1727204056.98507: checking for max_fail_percentage 10215 1727204056.98508: done checking for max_fail_percentage 10215 1727204056.98510: checking to see if all hosts have failed and the running result is not ok 10215 1727204056.98511: done checking to see if all hosts have failed 10215 1727204056.98512: getting the remaining hosts for this loop 10215 1727204056.98514: done getting the remaining hosts for this loop 10215 1727204056.98519: getting the next task for host managed-node3 10215 1727204056.98529: done getting next task for host managed-node3 10215 1727204056.98532: ^ task is: TASK: Get NM profile info 10215 1727204056.98537: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204056.98544: getting variables 10215 1727204056.98546: in VariableManager get_vars() 10215 1727204056.98700: Calling all_inventory to load vars for managed-node3 10215 1727204056.98704: Calling groups_inventory to load vars for managed-node3 10215 1727204056.98707: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204056.98722: Calling all_plugins_play to load vars for managed-node3 10215 1727204056.98727: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204056.98731: Calling groups_plugins_play to load vars for managed-node3 10215 1727204057.01088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204057.04056: done with get_vars() 10215 1727204057.04097: done getting variables 10215 1727204057.04170: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.077) 0:00:25.609 ***** 10215 1727204057.04211: entering _queue_task() for managed-node3/shell 10215 1727204057.04588: worker is 1 (out of 1 available) 10215 1727204057.04803: exiting _queue_task() for managed-node3/shell 10215 1727204057.04813: done queuing things up, now waiting for results queue to drain 10215 1727204057.04815: waiting for pending results... 10215 1727204057.04947: running TaskExecutor() for managed-node3/TASK: Get NM profile info 10215 1727204057.05071: in run() - task 12b410aa-8751-3c74-8f8e-000000000443 10215 1727204057.05151: variable 'ansible_search_path' from source: unknown 10215 1727204057.05155: variable 'ansible_search_path' from source: unknown 10215 1727204057.05158: calling self._execute() 10215 1727204057.05268: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.05283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.05303: variable 'omit' from source: magic vars 10215 1727204057.05752: variable 'ansible_distribution_major_version' from source: facts 10215 1727204057.05773: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204057.05786: variable 'omit' from source: magic vars 10215 1727204057.05863: variable 'omit' from source: magic vars 10215 1727204057.06025: variable 'profile' from source: include params 10215 1727204057.06029: variable 'item' from source: include params 10215 1727204057.06103: variable 'item' from source: include params 10215 1727204057.06243: variable 'omit' from source: magic vars 10215 1727204057.06247: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204057.06250: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204057.06269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204057.06299: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204057.06321: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204057.06367: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204057.06377: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.06387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.06555: Set connection var ansible_connection to ssh 10215 1727204057.06571: Set connection var ansible_pipelining to False 10215 1727204057.06584: Set connection var ansible_shell_type to sh 10215 1727204057.06650: Set connection var ansible_timeout to 10 10215 1727204057.06664: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204057.06680: Set connection var ansible_shell_executable to /bin/sh 10215 1727204057.06713: variable 'ansible_shell_executable' from source: unknown 10215 1727204057.06723: variable 'ansible_connection' from source: unknown 10215 1727204057.06731: variable 'ansible_module_compression' from source: unknown 10215 1727204057.06739: variable 'ansible_shell_type' from source: unknown 10215 1727204057.06752: variable 'ansible_shell_executable' from source: unknown 10215 1727204057.06761: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.06770: variable 'ansible_pipelining' from source: unknown 10215 1727204057.06859: variable 'ansible_timeout' from source: unknown 10215 1727204057.06863: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.07010: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204057.07030: variable 'omit' from source: magic vars 10215 1727204057.07041: starting attempt loop 10215 1727204057.07048: running the handler 10215 1727204057.07065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204057.07101: _low_level_execute_command(): starting 10215 1727204057.07117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204057.08006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 10215 1727204057.08037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204057.08120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.09861: stdout chunk (state=3): >>>/root <<< 10215 1727204057.10065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204057.10068: stdout chunk (state=3): >>><<< 10215 1727204057.10071: stderr chunk (state=3): >>><<< 10215 1727204057.10105: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204057.10196: _low_level_execute_command(): starting 10215 1727204057.10201: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466 `" && echo ansible-tmp-1727204057.1011195-11651-139824658844466="` echo /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466 `" ) && sleep 0' 10215 1727204057.10851: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204057.10871: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204057.10904: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204057.11020: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204057.11049: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204057.11075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204057.11108: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 10215 1727204057.11215: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.13258: stdout chunk (state=3): >>>ansible-tmp-1727204057.1011195-11651-139824658844466=/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466 <<< 10215 1727204057.13595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204057.13598: stdout chunk (state=3): >>><<< 10215 1727204057.13601: stderr chunk (state=3): >>><<< 10215 1727204057.13604: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204057.1011195-11651-139824658844466=/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204057.13606: variable 'ansible_module_compression' from source: unknown 10215 1727204057.13609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204057.13611: variable 'ansible_facts' from source: unknown 10215 1727204057.13700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py 10215 1727204057.13885: Sending initial data 10215 1727204057.13891: Sent initial data (156 bytes) 10215 1727204057.14499: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204057.14525: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204057.14609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204057.14648: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204057.14661: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204057.14679: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204057.14743: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.16375: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204057.16425: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204057.16496: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp3z0gqb14 /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py <<< 10215 1727204057.16520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp3z0gqb14" to remote "/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py" <<< 10215 1727204057.17782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204057.17786: stdout chunk (state=3): >>><<< 10215 1727204057.17788: stderr chunk (state=3): >>><<< 10215 1727204057.17798: done transferring module to remote 10215 1727204057.17801: _low_level_execute_command(): starting 10215 1727204057.17803: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/ /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py && sleep 0' 10215 1727204057.18932: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204057.18987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204057.19018: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204057.19065: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204057.19119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.21209: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204057.21212: stdout chunk (state=3): >>><<< 10215 1727204057.21215: stderr chunk (state=3): >>><<< 10215 1727204057.21396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204057.21400: _low_level_execute_command(): starting 10215 1727204057.21403: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/AnsiballZ_command.py && sleep 0' 10215 1727204057.22715: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204057.22742: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204057.22760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204057.22787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204057.22843: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.42623: stdout chunk (state=3): 
>>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:17.402468", "end": "2024-09-24 14:54:17.425200", "delta": "0:00:00.022732", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204057.44297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204057.44373: stderr chunk (state=3): >>><<< 10215 1727204057.44378: stdout chunk (state=3): >>><<< 10215 1727204057.44497: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:17.402468", "end": "2024-09-24 14:54:17.425200", "delta": "0:00:00.022732", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204057.44502: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204057.44506: _low_level_execute_command(): starting 10215 1727204057.44518: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204057.1011195-11651-139824658844466/ > /dev/null 2>&1 && sleep 0' 10215 1727204057.45985: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204057.46080: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204057.46181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204057.46348: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204057.46351: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204057.46416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204057.48459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204057.48465: stdout chunk (state=3): >>><<< 10215 1727204057.48596: stderr chunk (state=3): >>><<< 10215 1727204057.48600: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204057.48603: handler run complete 10215 1727204057.48624: Evaluated conditional (False): False 10215 1727204057.48638: attempt loop complete, returning result 10215 1727204057.48641: _execute() done 10215 1727204057.48646: dumping result to json 10215 1727204057.48653: done dumping result, returning 10215 1727204057.48778: done running TaskExecutor() for managed-node3/TASK: Get NM profile info [12b410aa-8751-3c74-8f8e-000000000443] 10215 1727204057.48786: sending task result for task 12b410aa-8751-3c74-8f8e-000000000443 10215 1727204057.49021: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000443 10215 1727204057.49025: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.022732", "end": "2024-09-24 14:54:17.425200", "rc": 0, "start": "2024-09-24 14:54:17.402468" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 10215 1727204057.49128: no more pending results, returning what we have 10215 1727204057.49132: results queue empty 10215 1727204057.49134: checking for any_errors_fatal 10215 1727204057.49143: done checking for any_errors_fatal 10215 1727204057.49144: checking for max_fail_percentage 10215 1727204057.49146: done checking for max_fail_percentage 10215 1727204057.49147: checking to see if all hosts have failed and the running result is not ok 10215 1727204057.49148: done checking to see if all hosts have failed 10215 1727204057.49149: getting the remaining hosts for this loop 10215 1727204057.49151: done getting the remaining hosts for this loop 10215 1727204057.49157: getting the next task for host managed-node3 10215 1727204057.49165: done getting next task for host managed-node3 10215 1727204057.49169: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204057.49173: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204057.49178: getting variables 10215 1727204057.49180: in VariableManager get_vars() 10215 1727204057.49546: Calling all_inventory to load vars for managed-node3 10215 1727204057.49550: Calling groups_inventory to load vars for managed-node3 10215 1727204057.49553: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204057.49565: Calling all_plugins_play to load vars for managed-node3 10215 1727204057.49569: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204057.49573: Calling groups_plugins_play to load vars for managed-node3 10215 1727204057.54311: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204057.60684: done with get_vars() 10215 1727204057.60733: done getting variables 10215 1727204057.60852: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.567) 0:00:26.177 ***** 10215 1727204057.61011: entering _queue_task() for managed-node3/set_fact 10215 1727204057.61793: worker is 1 (out of 1 available) 10215 1727204057.61810: exiting _queue_task() for managed-node3/set_fact 10215 1727204057.61823: done queuing things up, now waiting for results queue to drain 10215 1727204057.61825: waiting for pending results... 
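The set_fact task queued here runs entirely on the controller, which is why no SSH traffic appears before its result. Judging from the conditional evaluations and the three facts reported in the result that follows, the task is roughly equivalent to this sketch (the exact YAML at get_profile_stat.yml:35 may differ):

- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  ansible.builtin.set_fact:
    lsr_net_profile_exists: true
    lsr_net_profile_ansible_managed: true
    lsr_net_profile_fingerprint: true
  when: nm_profile_exists.rc == 0   # evaluated True below, so the facts are set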
10215 1727204057.62355: running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 10215 1727204057.62628: in run() - task 12b410aa-8751-3c74-8f8e-000000000444 10215 1727204057.62703: variable 'ansible_search_path' from source: unknown 10215 1727204057.62713: variable 'ansible_search_path' from source: unknown 10215 1727204057.62780: calling self._execute() 10215 1727204057.63064: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.63099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.63144: variable 'omit' from source: magic vars 10215 1727204057.64099: variable 'ansible_distribution_major_version' from source: facts 10215 1727204057.64120: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204057.64445: variable 'nm_profile_exists' from source: set_fact 10215 1727204057.64497: Evaluated conditional (nm_profile_exists.rc == 0): True 10215 1727204057.64898: variable 'omit' from source: magic vars 10215 1727204057.64901: variable 'omit' from source: magic vars 10215 1727204057.65195: variable 'omit' from source: magic vars 10215 1727204057.65198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204057.65794: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204057.65798: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204057.65801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204057.65803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204057.65805: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204057.65808: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.65810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.66018: Set connection var ansible_connection to ssh 10215 1727204057.66032: Set connection var ansible_pipelining to False 10215 1727204057.66044: Set connection var ansible_shell_type to sh 10215 1727204057.66056: Set connection var ansible_timeout to 10 10215 1727204057.66067: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204057.66082: Set connection var ansible_shell_executable to /bin/sh 10215 1727204057.66114: variable 'ansible_shell_executable' from source: unknown 10215 1727204057.66124: variable 'ansible_connection' from source: unknown 10215 1727204057.66132: variable 'ansible_module_compression' from source: unknown 10215 1727204057.66140: variable 'ansible_shell_type' from source: unknown 10215 1727204057.66148: variable 'ansible_shell_executable' from source: unknown 10215 1727204057.66155: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.66165: variable 'ansible_pipelining' from source: unknown 10215 1727204057.66172: variable 'ansible_timeout' from source: unknown 10215 1727204057.66182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.66561: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204057.66580: variable 'omit' from source: magic vars 10215 1727204057.66593: starting attempt loop 10215 1727204057.66602: running the handler 10215 1727204057.66623: handler run complete 10215 1727204057.66894: attempt loop complete, returning result 10215 1727204057.66898: _execute() done 10215 1727204057.66900: dumping result to json 10215 1727204057.66902: done dumping result, returning 10215 1727204057.66905: done running TaskExecutor() for managed-node3/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-3c74-8f8e-000000000444] 10215 1727204057.66907: sending task result for task 12b410aa-8751-3c74-8f8e-000000000444 10215 1727204057.66980: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000444 10215 1727204057.66984: WORKER PROCESS EXITING ok: [managed-node3] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 10215 1727204057.67048: no more pending results, returning what we have 10215 1727204057.67051: results queue empty 10215 1727204057.67052: checking for any_errors_fatal 10215 1727204057.67062: done checking for any_errors_fatal 10215 1727204057.67062: checking for max_fail_percentage 10215 1727204057.67064: done checking for max_fail_percentage 10215 1727204057.67065: checking to see if all hosts have failed and the running result is not ok 10215 1727204057.67066: done checking to see if all hosts have failed 10215 1727204057.67067: getting the remaining hosts for this loop 10215 1727204057.67069: done getting the remaining hosts for this loop 10215 1727204057.67074: getting the next task for host managed-node3 10215 1727204057.67086: done getting next task for host managed-node3 10215 1727204057.67089: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204057.67094: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204057.67100: getting variables 10215 1727204057.67102: in VariableManager get_vars() 10215 1727204057.67151: Calling all_inventory to load vars for managed-node3 10215 1727204057.67154: Calling groups_inventory to load vars for managed-node3 10215 1727204057.67157: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204057.67170: Calling all_plugins_play to load vars for managed-node3 10215 1727204057.67173: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204057.67177: Calling groups_plugins_play to load vars for managed-node3 10215 1727204057.72227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204057.79029: done with get_vars() 10215 1727204057.79082: done getting variables 10215 1727204057.79280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204057.79854: variable 'profile' from source: include params 10215 1727204057.79859: variable 'item' from source: include params 10215 1727204057.80152: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.191) 0:00:26.369 ***** 10215 1727204057.80198: entering _queue_task() for managed-node3/command 10215 1727204057.81067: worker is 1 (out of 1 available) 10215 1727204057.81081: exiting _queue_task() for managed-node3/command 10215 1727204057.81238: done queuing things up, now waiting for results queue to drain 10215 1727204057.81241: waiting for pending results... 
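The task queued next ("Get the ansible_managed comment in ifcfg-bond0.1", get_profile_stat.yml:49) and the three tasks after it are all gated on profile_stat.stat.exists. On this host the bond0.1 profile lives under /etc/NetworkManager/system-connections, so no ifcfg file exists and each of these tasks is skipped. The pattern is roughly the sketch below; the grep expression, path, and register name are purely illustrative, since a skipped task never reveals its command in this log:

- name: Get the ansible_managed comment in ifcfg-{{ profile }}
  ansible.builtin.command: grep '^# Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # hypothetical command and path
  register: ifcfg_ansible_managed   # hypothetical register name
  when: profile_stat.stat.exists    # False on this host, so the task is skipped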
10215 1727204057.81631: running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 10215 1727204057.81950: in run() - task 12b410aa-8751-3c74-8f8e-000000000446 10215 1727204057.82075: variable 'ansible_search_path' from source: unknown 10215 1727204057.82695: variable 'ansible_search_path' from source: unknown 10215 1727204057.82699: calling self._execute() 10215 1727204057.82702: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.82705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.82709: variable 'omit' from source: magic vars 10215 1727204057.84094: variable 'ansible_distribution_major_version' from source: facts 10215 1727204057.84098: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204057.84101: variable 'profile_stat' from source: set_fact 10215 1727204057.84103: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204057.84106: when evaluation is False, skipping this task 10215 1727204057.84109: _execute() done 10215 1727204057.84111: dumping result to json 10215 1727204057.84299: done dumping result, returning 10215 1727204057.84312: done running TaskExecutor() for managed-node3/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-3c74-8f8e-000000000446] 10215 1727204057.84322: sending task result for task 12b410aa-8751-3c74-8f8e-000000000446 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204057.84488: no more pending results, returning what we have 10215 1727204057.84494: results queue empty 10215 1727204057.84495: checking for any_errors_fatal 10215 1727204057.84597: done checking for any_errors_fatal 10215 1727204057.84599: checking for max_fail_percentage 10215 1727204057.84601: done checking for max_fail_percentage 10215 1727204057.84602: checking to see if all hosts have failed and the running result is not ok 10215 1727204057.84603: done checking to see if all hosts have failed 10215 1727204057.84604: getting the remaining hosts for this loop 10215 1727204057.84606: done getting the remaining hosts for this loop 10215 1727204057.84618: getting the next task for host managed-node3 10215 1727204057.84625: done getting next task for host managed-node3 10215 1727204057.84628: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 10215 1727204057.84632: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204057.84638: getting variables 10215 1727204057.84640: in VariableManager get_vars() 10215 1727204057.84686: Calling all_inventory to load vars for managed-node3 10215 1727204057.84792: Calling groups_inventory to load vars for managed-node3 10215 1727204057.84797: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204057.84815: Calling all_plugins_play to load vars for managed-node3 10215 1727204057.84819: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204057.84824: Calling groups_plugins_play to load vars for managed-node3 10215 1727204057.85419: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000446 10215 1727204057.85424: WORKER PROCESS EXITING 10215 1727204057.89619: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204057.96072: done with get_vars() 10215 1727204057.96127: done getting variables 10215 1727204057.96341: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204057.96692: variable 'profile' from source: include params 10215 1727204057.96698: variable 'item' from source: include params 10215 1727204057.96781: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.166) 0:00:26.535 ***** 10215 1727204057.96822: entering _queue_task() for managed-node3/set_fact 10215 1727204057.97736: worker is 1 (out of 1 available) 10215 1727204057.97752: exiting _queue_task() for managed-node3/set_fact 10215 1727204057.97766: done queuing things up, now waiting for results queue to drain 10215 1727204057.97768: waiting for pending results... 
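profile_stat itself is never set in this excerpt; its .stat.exists attribute implies it was registered by an earlier stat task in get_profile_stat.yml. A plausible sketch of that task, with the task name and path being assumptions:

- name: Get stat of the ifcfg file   # hypothetical task name, runs earlier in get_profile_stat.yml
  ansible.builtin.stat:
    path: /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed initscripts profile location
  register: profile_stat             # queried as profile_stat.stat.exists in the tasks above and below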
10215 1727204057.98185: running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 10215 1727204057.98422: in run() - task 12b410aa-8751-3c74-8f8e-000000000447 10215 1727204057.98444: variable 'ansible_search_path' from source: unknown 10215 1727204057.98453: variable 'ansible_search_path' from source: unknown 10215 1727204057.98500: calling self._execute() 10215 1727204057.98606: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204057.98620: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204057.98636: variable 'omit' from source: magic vars 10215 1727204057.99060: variable 'ansible_distribution_major_version' from source: facts 10215 1727204057.99079: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204057.99239: variable 'profile_stat' from source: set_fact 10215 1727204057.99262: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204057.99271: when evaluation is False, skipping this task 10215 1727204057.99278: _execute() done 10215 1727204057.99286: dumping result to json 10215 1727204057.99296: done dumping result, returning 10215 1727204057.99309: done running TaskExecutor() for managed-node3/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-3c74-8f8e-000000000447] 10215 1727204057.99321: sending task result for task 12b410aa-8751-3c74-8f8e-000000000447 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204057.99487: no more pending results, returning what we have 10215 1727204057.99494: results queue empty 10215 1727204057.99495: checking for any_errors_fatal 10215 1727204057.99503: done checking for any_errors_fatal 10215 1727204057.99504: checking for max_fail_percentage 10215 1727204057.99505: done checking for max_fail_percentage 10215 1727204057.99509: checking to see if all hosts have failed and the running result is not ok 10215 1727204057.99510: done checking to see if all hosts have failed 10215 1727204057.99510: getting the remaining hosts for this loop 10215 1727204057.99512: done getting the remaining hosts for this loop 10215 1727204057.99517: getting the next task for host managed-node3 10215 1727204057.99525: done getting next task for host managed-node3 10215 1727204057.99527: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 10215 1727204057.99532: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204057.99536: getting variables 10215 1727204057.99539: in VariableManager get_vars() 10215 1727204057.99586: Calling all_inventory to load vars for managed-node3 10215 1727204057.99758: Calling groups_inventory to load vars for managed-node3 10215 1727204057.99763: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204057.99771: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000447 10215 1727204057.99774: WORKER PROCESS EXITING 10215 1727204057.99785: Calling all_plugins_play to load vars for managed-node3 10215 1727204057.99792: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204057.99797: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.01913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.04816: done with get_vars() 10215 1727204058.04857: done getting variables 10215 1727204058.04932: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204058.05064: variable 'profile' from source: include params 10215 1727204058.05068: variable 'item' from source: include params 10215 1727204058.05142: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.083) 0:00:26.618 ***** 10215 1727204058.05179: entering _queue_task() for managed-node3/command 10215 1727204058.05553: worker is 1 (out of 1 available) 10215 1727204058.05568: exiting _queue_task() for managed-node3/command 10215 1727204058.05581: done queuing things up, now waiting for results queue to drain 10215 1727204058.05583: waiting for pending results... 
10215 1727204058.05922: running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 10215 1727204058.06044: in run() - task 12b410aa-8751-3c74-8f8e-000000000448 10215 1727204058.06125: variable 'ansible_search_path' from source: unknown 10215 1727204058.06129: variable 'ansible_search_path' from source: unknown 10215 1727204058.06132: calling self._execute() 10215 1727204058.06237: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.06253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.06274: variable 'omit' from source: magic vars 10215 1727204058.06720: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.06739: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.06902: variable 'profile_stat' from source: set_fact 10215 1727204058.06924: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204058.06995: when evaluation is False, skipping this task 10215 1727204058.06998: _execute() done 10215 1727204058.07001: dumping result to json 10215 1727204058.07003: done dumping result, returning 10215 1727204058.07005: done running TaskExecutor() for managed-node3/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-3c74-8f8e-000000000448] 10215 1727204058.07007: sending task result for task 12b410aa-8751-3c74-8f8e-000000000448 10215 1727204058.07073: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000448 10215 1727204058.07076: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204058.07162: no more pending results, returning what we have 10215 1727204058.07166: results queue empty 10215 1727204058.07168: checking for any_errors_fatal 10215 1727204058.07175: done checking for any_errors_fatal 10215 1727204058.07176: checking for max_fail_percentage 10215 1727204058.07178: done checking for max_fail_percentage 10215 1727204058.07179: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.07180: done checking to see if all hosts have failed 10215 1727204058.07181: getting the remaining hosts for this loop 10215 1727204058.07183: done getting the remaining hosts for this loop 10215 1727204058.07188: getting the next task for host managed-node3 10215 1727204058.07198: done getting next task for host managed-node3 10215 1727204058.07201: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 10215 1727204058.07205: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.07211: getting variables 10215 1727204058.07213: in VariableManager get_vars() 10215 1727204058.07259: Calling all_inventory to load vars for managed-node3 10215 1727204058.07262: Calling groups_inventory to load vars for managed-node3 10215 1727204058.07265: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.07282: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.07285: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.07564: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.09759: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.14211: done with get_vars() 10215 1727204058.14264: done getting variables 10215 1727204058.14353: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204058.14499: variable 'profile' from source: include params 10215 1727204058.14504: variable 'item' from source: include params 10215 1727204058.14585: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.094) 0:00:26.713 ***** 10215 1727204058.14630: entering _queue_task() for managed-node3/set_fact 10215 1727204058.15019: worker is 1 (out of 1 available) 10215 1727204058.15034: exiting _queue_task() for managed-node3/set_fact 10215 1727204058.15049: done queuing things up, now waiting for results queue to drain 10215 1727204058.15051: waiting for pending results... 
10215 1727204058.15431: running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 10215 1727204058.15558: in run() - task 12b410aa-8751-3c74-8f8e-000000000449 10215 1727204058.15581: variable 'ansible_search_path' from source: unknown 10215 1727204058.15592: variable 'ansible_search_path' from source: unknown 10215 1727204058.15635: calling self._execute() 10215 1727204058.16098: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.16102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.16105: variable 'omit' from source: magic vars 10215 1727204058.16851: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.16992: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.17530: variable 'profile_stat' from source: set_fact 10215 1727204058.17548: Evaluated conditional (profile_stat.stat.exists): False 10215 1727204058.17561: when evaluation is False, skipping this task 10215 1727204058.17564: _execute() done 10215 1727204058.17567: dumping result to json 10215 1727204058.17572: done dumping result, returning 10215 1727204058.17581: done running TaskExecutor() for managed-node3/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-3c74-8f8e-000000000449] 10215 1727204058.17588: sending task result for task 12b410aa-8751-3c74-8f8e-000000000449 skipping: [managed-node3] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 10215 1727204058.17749: no more pending results, returning what we have 10215 1727204058.17754: results queue empty 10215 1727204058.17756: checking for any_errors_fatal 10215 1727204058.17768: done checking for any_errors_fatal 10215 1727204058.17769: checking for max_fail_percentage 10215 1727204058.17771: done checking for max_fail_percentage 10215 1727204058.17773: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.17774: done checking to see if all hosts have failed 10215 1727204058.17775: getting the remaining hosts for this loop 10215 1727204058.17777: done getting the remaining hosts for this loop 10215 1727204058.17783: getting the next task for host managed-node3 10215 1727204058.17795: done getting next task for host managed-node3 10215 1727204058.17798: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 10215 1727204058.17801: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.17809: getting variables 10215 1727204058.17811: in VariableManager get_vars() 10215 1727204058.17862: Calling all_inventory to load vars for managed-node3 10215 1727204058.17865: Calling groups_inventory to load vars for managed-node3 10215 1727204058.17868: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.17884: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.17888: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.18167: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.18805: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000449 10215 1727204058.18809: WORKER PROCESS EXITING 10215 1727204058.22027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.26256: done with get_vars() 10215 1727204058.26316: done getting variables 10215 1727204058.26395: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204058.26541: variable 'profile' from source: include params 10215 1727204058.26546: variable 'item' from source: include params 10215 1727204058.26622: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.120) 0:00:26.833 ***** 10215 1727204058.26661: entering _queue_task() for managed-node3/assert 10215 1727204058.27058: worker is 1 (out of 1 available) 10215 1727204058.27072: exiting _queue_task() for managed-node3/assert 10215 1727204058.27086: done queuing things up, now waiting for results queue to drain 10215 1727204058.27088: waiting for pending results... 
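The three assert tasks that close this block (assert_profile_present.yml:5, 10 and 15) run on the controller and simply test the facts set earlier; each conditional evaluates to True below, producing "All assertions passed". A condensed sketch of the pattern, with any fail_msg wording omitted as unknown:

- name: Assert that the profile is present - '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_exists

- name: Assert that the ansible managed comment is present in '{{ profile }}'
  ansible.builtin.assert:
    that:
      - lsr_net_profile_ansible_managed

- name: Assert that the fingerprint comment is present in {{ profile }}
  ansible.builtin.assert:
    that:
      - lsr_net_profile_fingerprint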
10215 1727204058.27413: running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' 10215 1727204058.27534: in run() - task 12b410aa-8751-3c74-8f8e-00000000026e 10215 1727204058.27546: variable 'ansible_search_path' from source: unknown 10215 1727204058.27551: variable 'ansible_search_path' from source: unknown 10215 1727204058.27595: calling self._execute() 10215 1727204058.27708: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.27724: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.27737: variable 'omit' from source: magic vars 10215 1727204058.28181: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.28196: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.28202: variable 'omit' from source: magic vars 10215 1727204058.28250: variable 'omit' from source: magic vars 10215 1727204058.28380: variable 'profile' from source: include params 10215 1727204058.28384: variable 'item' from source: include params 10215 1727204058.28494: variable 'item' from source: include params 10215 1727204058.28498: variable 'omit' from source: magic vars 10215 1727204058.28534: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204058.28574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204058.28603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204058.28672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.28676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.28680: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204058.28682: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.28685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.28808: Set connection var ansible_connection to ssh 10215 1727204058.28819: Set connection var ansible_pipelining to False 10215 1727204058.28826: Set connection var ansible_shell_type to sh 10215 1727204058.28838: Set connection var ansible_timeout to 10 10215 1727204058.28841: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204058.28888: Set connection var ansible_shell_executable to /bin/sh 10215 1727204058.28891: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.28896: variable 'ansible_connection' from source: unknown 10215 1727204058.28898: variable 'ansible_module_compression' from source: unknown 10215 1727204058.28900: variable 'ansible_shell_type' from source: unknown 10215 1727204058.28907: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.28910: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.28912: variable 'ansible_pipelining' from source: unknown 10215 1727204058.28914: variable 'ansible_timeout' from source: unknown 10215 1727204058.28916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.29108: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204058.29112: variable 'omit' from source: magic vars 10215 1727204058.29115: starting attempt loop 10215 1727204058.29117: running the handler 10215 1727204058.29247: variable 'lsr_net_profile_exists' from source: set_fact 10215 1727204058.29253: Evaluated conditional (lsr_net_profile_exists): True 10215 1727204058.29277: handler run complete 10215 1727204058.29280: attempt loop complete, returning result 10215 1727204058.29285: _execute() done 10215 1727204058.29288: dumping result to json 10215 1727204058.29325: done dumping result, returning 10215 1727204058.29329: done running TaskExecutor() for managed-node3/TASK: Assert that the profile is present - 'bond0.1' [12b410aa-8751-3c74-8f8e-00000000026e] 10215 1727204058.29331: sending task result for task 12b410aa-8751-3c74-8f8e-00000000026e 10215 1727204058.29495: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000026e ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204058.29561: no more pending results, returning what we have 10215 1727204058.29566: results queue empty 10215 1727204058.29567: checking for any_errors_fatal 10215 1727204058.29577: done checking for any_errors_fatal 10215 1727204058.29578: checking for max_fail_percentage 10215 1727204058.29580: done checking for max_fail_percentage 10215 1727204058.29582: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.29583: done checking to see if all hosts have failed 10215 1727204058.29584: getting the remaining hosts for this loop 10215 1727204058.29586: done getting the remaining hosts for this loop 10215 1727204058.29595: getting the next task for host managed-node3 10215 1727204058.29603: done getting next task for host managed-node3 10215 1727204058.29609: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 10215 1727204058.29612: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.29618: getting variables 10215 1727204058.29620: in VariableManager get_vars() 10215 1727204058.29673: Calling all_inventory to load vars for managed-node3 10215 1727204058.29677: Calling groups_inventory to load vars for managed-node3 10215 1727204058.29680: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.29899: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.29904: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.29912: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.30444: WORKER PROCESS EXITING 10215 1727204058.32187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.35249: done with get_vars() 10215 1727204058.35300: done getting variables 10215 1727204058.35378: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204058.35568: variable 'profile' from source: include params 10215 1727204058.35572: variable 'item' from source: include params 10215 1727204058.35670: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.090) 0:00:26.924 ***** 10215 1727204058.35715: entering _queue_task() for managed-node3/assert 10215 1727204058.36321: worker is 1 (out of 1 available) 10215 1727204058.36333: exiting _queue_task() for managed-node3/assert 10215 1727204058.36346: done queuing things up, now waiting for results queue to drain 10215 1727204058.36348: waiting for pending results... 
10215 1727204058.36643: running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' 10215 1727204058.36797: in run() - task 12b410aa-8751-3c74-8f8e-00000000026f 10215 1727204058.36802: variable 'ansible_search_path' from source: unknown 10215 1727204058.36806: variable 'ansible_search_path' from source: unknown 10215 1727204058.36896: calling self._execute() 10215 1727204058.36968: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.36977: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.36993: variable 'omit' from source: magic vars 10215 1727204058.37536: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.37551: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.37594: variable 'omit' from source: magic vars 10215 1727204058.37620: variable 'omit' from source: magic vars 10215 1727204058.37777: variable 'profile' from source: include params 10215 1727204058.37781: variable 'item' from source: include params 10215 1727204058.37883: variable 'item' from source: include params 10215 1727204058.37994: variable 'omit' from source: magic vars 10215 1727204058.38000: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204058.38075: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204058.38097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204058.38117: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.38130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.38167: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204058.38170: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.38175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.38287: Set connection var ansible_connection to ssh 10215 1727204058.38602: Set connection var ansible_pipelining to False 10215 1727204058.38619: Set connection var ansible_shell_type to sh 10215 1727204058.38631: Set connection var ansible_timeout to 10 10215 1727204058.38642: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204058.38656: Set connection var ansible_shell_executable to /bin/sh 10215 1727204058.38683: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.38694: variable 'ansible_connection' from source: unknown 10215 1727204058.38702: variable 'ansible_module_compression' from source: unknown 10215 1727204058.38713: variable 'ansible_shell_type' from source: unknown 10215 1727204058.38794: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.38798: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.38800: variable 'ansible_pipelining' from source: unknown 10215 1727204058.38803: variable 'ansible_timeout' from source: unknown 10215 1727204058.38805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.38916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204058.38935: variable 'omit' from source: magic vars 10215 1727204058.38945: starting attempt loop 10215 1727204058.38953: running the handler 10215 1727204058.39087: variable 'lsr_net_profile_ansible_managed' from source: set_fact 10215 1727204058.39105: Evaluated conditional (lsr_net_profile_ansible_managed): True 10215 1727204058.39122: handler run complete 10215 1727204058.39149: attempt loop complete, returning result 10215 1727204058.39157: _execute() done 10215 1727204058.39165: dumping result to json 10215 1727204058.39172: done dumping result, returning 10215 1727204058.39399: done running TaskExecutor() for managed-node3/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12b410aa-8751-3c74-8f8e-00000000026f] 10215 1727204058.39402: sending task result for task 12b410aa-8751-3c74-8f8e-00000000026f 10215 1727204058.39474: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000026f 10215 1727204058.39478: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204058.39529: no more pending results, returning what we have 10215 1727204058.39532: results queue empty 10215 1727204058.39533: checking for any_errors_fatal 10215 1727204058.39540: done checking for any_errors_fatal 10215 1727204058.39541: checking for max_fail_percentage 10215 1727204058.39543: done checking for max_fail_percentage 10215 1727204058.39544: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.39545: done checking to see if all hosts have failed 10215 1727204058.39546: getting the remaining hosts for this loop 10215 1727204058.39547: done getting the remaining hosts for this loop 10215 1727204058.39550: getting the next task for host managed-node3 10215 1727204058.39556: done getting next task for host managed-node3 10215 1727204058.39558: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 10215 1727204058.39561: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.39565: getting variables 10215 1727204058.39567: in VariableManager get_vars() 10215 1727204058.39609: Calling all_inventory to load vars for managed-node3 10215 1727204058.39612: Calling groups_inventory to load vars for managed-node3 10215 1727204058.39615: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.39626: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.39629: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.39633: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.42030: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.44993: done with get_vars() 10215 1727204058.45044: done getting variables 10215 1727204058.45122: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204058.45267: variable 'profile' from source: include params 10215 1727204058.45272: variable 'item' from source: include params 10215 1727204058.45346: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.096) 0:00:27.021 ***** 10215 1727204058.45396: entering _queue_task() for managed-node3/assert 10215 1727204058.45775: worker is 1 (out of 1 available) 10215 1727204058.45897: exiting _queue_task() for managed-node3/assert 10215 1727204058.45910: done queuing things up, now waiting for results queue to drain 10215 1727204058.45912: waiting for pending results... 
10215 1727204058.46126: running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 10215 1727204058.46244: in run() - task 12b410aa-8751-3c74-8f8e-000000000270 10215 1727204058.46267: variable 'ansible_search_path' from source: unknown 10215 1727204058.46271: variable 'ansible_search_path' from source: unknown 10215 1727204058.46315: calling self._execute() 10215 1727204058.46437: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.46452: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.46465: variable 'omit' from source: magic vars 10215 1727204058.46913: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.46925: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.46932: variable 'omit' from source: magic vars 10215 1727204058.46993: variable 'omit' from source: magic vars 10215 1727204058.47126: variable 'profile' from source: include params 10215 1727204058.47130: variable 'item' from source: include params 10215 1727204058.47218: variable 'item' from source: include params 10215 1727204058.47238: variable 'omit' from source: magic vars 10215 1727204058.47394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204058.47399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204058.47402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204058.47404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.47409: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.47432: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204058.47436: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.47442: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.47572: Set connection var ansible_connection to ssh 10215 1727204058.47580: Set connection var ansible_pipelining to False 10215 1727204058.47587: Set connection var ansible_shell_type to sh 10215 1727204058.47603: Set connection var ansible_timeout to 10 10215 1727204058.47612: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204058.47622: Set connection var ansible_shell_executable to /bin/sh 10215 1727204058.47654: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.47658: variable 'ansible_connection' from source: unknown 10215 1727204058.47660: variable 'ansible_module_compression' from source: unknown 10215 1727204058.47665: variable 'ansible_shell_type' from source: unknown 10215 1727204058.47668: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.47673: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.47679: variable 'ansible_pipelining' from source: unknown 10215 1727204058.47681: variable 'ansible_timeout' from source: unknown 10215 1727204058.47688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.47864: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204058.47995: variable 'omit' from source: magic vars 10215 1727204058.47999: starting attempt loop 10215 1727204058.48002: running the handler 10215 1727204058.48027: variable 'lsr_net_profile_fingerprint' from source: set_fact 10215 1727204058.48035: Evaluated conditional (lsr_net_profile_fingerprint): True 10215 1727204058.48067: handler run complete 10215 1727204058.48095: attempt loop complete, returning result 10215 1727204058.48099: _execute() done 10215 1727204058.48101: dumping result to json 10215 1727204058.48109: done dumping result, returning 10215 1727204058.48116: done running TaskExecutor() for managed-node3/TASK: Assert that the fingerprint comment is present in bond0.1 [12b410aa-8751-3c74-8f8e-000000000270] 10215 1727204058.48124: sending task result for task 12b410aa-8751-3c74-8f8e-000000000270 10215 1727204058.48225: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000270 10215 1727204058.48229: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false } MSG: All assertions passed 10215 1727204058.48286: no more pending results, returning what we have 10215 1727204058.48295: results queue empty 10215 1727204058.48297: checking for any_errors_fatal 10215 1727204058.48306: done checking for any_errors_fatal 10215 1727204058.48307: checking for max_fail_percentage 10215 1727204058.48309: done checking for max_fail_percentage 10215 1727204058.48310: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.48312: done checking to see if all hosts have failed 10215 1727204058.48313: getting the remaining hosts for this loop 10215 1727204058.48314: done getting the remaining hosts for this loop 10215 1727204058.48319: getting the next task for host managed-node3 10215 1727204058.48328: done getting next task for host managed-node3 10215 1727204058.48331: ^ task is: TASK: ** TEST check polling interval 10215 1727204058.48333: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.48337: getting variables 10215 1727204058.48340: in VariableManager get_vars() 10215 1727204058.48387: Calling all_inventory to load vars for managed-node3 10215 1727204058.48596: Calling groups_inventory to load vars for managed-node3 10215 1727204058.48601: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.48612: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.48616: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.48620: Calling groups_plugins_play to load vars for managed-node3 10215 1727204058.51040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204058.54094: done with get_vars() 10215 1727204058.54143: done getting variables 10215 1727204058.54223: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.088) 0:00:27.109 ***** 10215 1727204058.54256: entering _queue_task() for managed-node3/command 10215 1727204058.54648: worker is 1 (out of 1 available) 10215 1727204058.54666: exiting _queue_task() for managed-node3/command 10215 1727204058.54679: done queuing things up, now waiting for results queue to drain 10215 1727204058.54681: waiting for pending results... 10215 1727204058.54914: running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval 10215 1727204058.55012: in run() - task 12b410aa-8751-3c74-8f8e-000000000071 10215 1727204058.55023: variable 'ansible_search_path' from source: unknown 10215 1727204058.55070: calling self._execute() 10215 1727204058.55231: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.55235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.55238: variable 'omit' from source: magic vars 10215 1727204058.55653: variable 'ansible_distribution_major_version' from source: facts 10215 1727204058.55665: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204058.55672: variable 'omit' from source: magic vars 10215 1727204058.55774: variable 'omit' from source: magic vars 10215 1727204058.55824: variable 'controller_device' from source: play vars 10215 1727204058.55846: variable 'omit' from source: magic vars 10215 1727204058.55898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204058.55944: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204058.55966: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204058.55989: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.56013: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204058.56046: variable 'inventory_hostname' from source: host vars for 
'managed-node3' 10215 1727204058.56050: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.56056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.56182: Set connection var ansible_connection to ssh 10215 1727204058.56191: Set connection var ansible_pipelining to False 10215 1727204058.56205: Set connection var ansible_shell_type to sh 10215 1727204058.56215: Set connection var ansible_timeout to 10 10215 1727204058.56218: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204058.56230: Set connection var ansible_shell_executable to /bin/sh 10215 1727204058.56316: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.56321: variable 'ansible_connection' from source: unknown 10215 1727204058.56324: variable 'ansible_module_compression' from source: unknown 10215 1727204058.56327: variable 'ansible_shell_type' from source: unknown 10215 1727204058.56329: variable 'ansible_shell_executable' from source: unknown 10215 1727204058.56331: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204058.56333: variable 'ansible_pipelining' from source: unknown 10215 1727204058.56335: variable 'ansible_timeout' from source: unknown 10215 1727204058.56337: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204058.56503: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204058.56510: variable 'omit' from source: magic vars 10215 1727204058.56513: starting attempt loop 10215 1727204058.56515: running the handler 10215 1727204058.56518: _low_level_execute_command(): starting 10215 1727204058.56520: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204058.57296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.57325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204058.57411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204058.57433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204058.57445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204058.57466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.57542: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.59320: stdout chunk (state=3): >>>/root 
<<< 10215 1727204058.59516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.59520: stdout chunk (state=3): >>><<< 10215 1727204058.59523: stderr chunk (state=3): >>><<< 10215 1727204058.59552: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204058.59668: _low_level_execute_command(): starting 10215 1727204058.59673: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181 `" && echo ansible-tmp-1727204058.5955904-11691-9717183807181="` echo /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181 `" ) && sleep 0' 10215 1727204058.60357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.60407: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204058.60458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204058.60553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204058.60571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204058.60616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.60665: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.62786: stdout chunk (state=3): >>>ansible-tmp-1727204058.5955904-11691-9717183807181=/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181 <<< 10215 1727204058.62894: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.63110: stderr chunk (state=3): >>><<< 10215 1727204058.63114: stdout chunk (state=3): >>><<< 10215 1727204058.63117: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204058.5955904-11691-9717183807181=/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204058.63121: variable 'ansible_module_compression' from source: unknown 10215 1727204058.63241: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204058.63286: variable 'ansible_facts' from source: unknown 10215 1727204058.63421: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py 10215 1727204058.63675: Sending initial data 10215 1727204058.63679: Sent initial data (154 bytes) 10215 1727204058.64637: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.64655: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204058.64734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204058.64770: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204058.64821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.64856: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.66695: stderr chunk (state=3): >>>debug2: Remote 
version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204058.66833: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204058.66867: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmptc01ztsd /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py <<< 10215 1727204058.66877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py" <<< 10215 1727204058.66983: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmptc01ztsd" to remote "/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py" <<< 10215 1727204058.68987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.69015: stderr chunk (state=3): >>><<< 10215 1727204058.69019: stdout chunk (state=3): >>><<< 10215 1727204058.69046: done transferring module to remote 10215 1727204058.69060: _low_level_execute_command(): starting 10215 1727204058.69066: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/ /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py && sleep 0' 10215 1727204058.70236: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.70246: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204058.70258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204058.70504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204058.70581: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204058.70594: stderr chunk (state=3): 
>>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.70657: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.72525: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.72616: stderr chunk (state=3): >>><<< 10215 1727204058.72943: stdout chunk (state=3): >>><<< 10215 1727204058.72946: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204058.72949: _low_level_execute_command(): starting 10215 1727204058.72952: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/AnsiballZ_command.py && sleep 0' 10215 1727204058.73999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.74105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204058.74312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204058.74408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.74481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.92233: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:54:18.917459", "end": "2024-09-24 14:54:18.921082", "delta": "0:00:00.003623", "msg": "", "invocation": 
{"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204058.94199: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.94203: stderr chunk (state=3): >>>Shared connection to 10.31.10.90 closed. <<< 10215 1727204058.94206: stderr chunk (state=3): >>><<< 10215 1727204058.94211: stdout chunk (state=3): >>><<< 10215 1727204058.94238: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-24 14:54:18.917459", "end": "2024-09-24 14:54:18.921082", "delta": "0:00:00.003623", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
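Each remote action in this trace reuses the same multiplexed SSH connection, and the worker logs the connection variables it resolves before every task: ansible_connection=ssh, pipelining off, sh shell type, a 10-second timeout, ZIP_DEFLATED module compression, and /bin/sh as the shell executable. A sketch of how the same settings could be pinned explicitly as group variables, assuming (as the "from source: unknown" entries suggest) that they currently come from defaults rather than from the inventory:

# group_vars/all.yml -- hypothetical file; values mirror the "Set connection var" lines in this trace
ansible_connection: ssh
ansible_pipelining: false
ansible_shell_type: sh
ansible_timeout: 10
ansible_shell_executable: /bin/sh
ansible_module_compression: ZIP_DEFLATED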
10215 1727204058.94286: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204058.94501: _low_level_execute_command(): starting 10215 1727204058.94510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204058.5955904-11691-9717183807181/ > /dev/null 2>&1 && sleep 0' 10215 1727204058.95633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204058.95911: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204058.96117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204058.96185: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204058.98224: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204058.98231: stdout chunk (state=3): >>><<< 10215 1727204058.98234: stderr chunk (state=3): >>><<< 10215 1727204058.98253: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204058.98267: handler run complete 10215 1727204058.98310: Evaluated conditional (False): False 10215 1727204058.98681: variable 'result' from source: unknown 10215 1727204058.98994: Evaluated conditional ('110' in result.stdout): True 10215 1727204058.98997: attempt loop complete, returning result 10215 1727204058.99000: _execute() done 10215 1727204058.99002: dumping result to json 10215 1727204058.99005: done dumping result, returning 10215 1727204058.99011: done running TaskExecutor() for managed-node3/TASK: ** TEST check polling interval [12b410aa-8751-3c74-8f8e-000000000071] 10215 1727204058.99013: sending task result for task 12b410aa-8751-3c74-8f8e-000000000071 ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003623", "end": "2024-09-24 14:54:18.921082", "rc": 0, "start": "2024-09-24 14:54:18.917459" } STDOUT: MII Polling Interval (ms): 110 10215 1727204058.99275: no more pending results, returning what we have 10215 1727204058.99280: results queue empty 10215 1727204058.99281: checking for any_errors_fatal 10215 1727204058.99291: done checking for any_errors_fatal 10215 1727204058.99292: checking for max_fail_percentage 10215 1727204058.99294: done checking for max_fail_percentage 10215 1727204058.99295: checking to see if all hosts have failed and the running result is not ok 10215 1727204058.99296: done checking to see if all hosts have failed 10215 1727204058.99297: getting the remaining hosts for this loop 10215 1727204058.99299: done getting the remaining hosts for this loop 10215 1727204058.99305: getting the next task for host managed-node3 10215 1727204058.99312: done getting next task for host managed-node3 10215 1727204058.99315: ^ task is: TASK: ** TEST check IPv4 10215 1727204058.99317: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204058.99322: getting variables 10215 1727204058.99324: in VariableManager get_vars() 10215 1727204058.99372: Calling all_inventory to load vars for managed-node3 10215 1727204058.99376: Calling groups_inventory to load vars for managed-node3 10215 1727204058.99379: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204058.99509: Calling all_plugins_play to load vars for managed-node3 10215 1727204058.99515: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204058.99520: Calling groups_plugins_play to load vars for managed-node3 10215 1727204059.00196: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000071 10215 1727204059.00202: WORKER PROCESS EXITING 10215 1727204059.03842: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204059.09832: done with get_vars() 10215 1727204059.09893: done getting variables 10215 1727204059.09971: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.558) 0:00:27.668 ***** 10215 1727204059.10114: entering _queue_task() for managed-node3/command 10215 1727204059.10993: worker is 1 (out of 1 available) 10215 1727204059.11006: exiting _queue_task() for managed-node3/command 10215 1727204059.11019: done queuing things up, now waiting for results queue to drain 10215 1727204059.11021: waiting for pending results... 
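The polling-interval test that just completed is a plain command task: it greps /proc/net/bonding/nm-bond on the managed host, registers the output, and the executor then reports attempts: 1 and evaluates the conditional ('110' in result.stdout) as True, which matches a retry loop whose until condition passed on the first try. A sketch of a task consistent with that trace; controller_device matches the "from source: play vars" reference above, but the exact wording in tests_bond.yml is not reproduced here:

# --- illustrative sketch, not the verbatim tests_bond.yml task ---
- name: "** TEST check polling interval"
  command: grep 'Polling Interval' /proc/net/bonding/{{ controller_device }}
  register: result
  until: "'110' in result.stdout"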
10215 1727204059.11158: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 10215 1727204059.11328: in run() - task 12b410aa-8751-3c74-8f8e-000000000072 10215 1727204059.11332: variable 'ansible_search_path' from source: unknown 10215 1727204059.11341: calling self._execute() 10215 1727204059.11469: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.11483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.11499: variable 'omit' from source: magic vars 10215 1727204059.12015: variable 'ansible_distribution_major_version' from source: facts 10215 1727204059.12019: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204059.12022: variable 'omit' from source: magic vars 10215 1727204059.12030: variable 'omit' from source: magic vars 10215 1727204059.12142: variable 'controller_device' from source: play vars 10215 1727204059.12164: variable 'omit' from source: magic vars 10215 1727204059.12216: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204059.12260: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204059.12283: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204059.12310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204059.12324: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204059.12549: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204059.12553: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.12555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.12558: Set connection var ansible_connection to ssh 10215 1727204059.12560: Set connection var ansible_pipelining to False 10215 1727204059.12563: Set connection var ansible_shell_type to sh 10215 1727204059.12565: Set connection var ansible_timeout to 10 10215 1727204059.12568: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204059.12574: Set connection var ansible_shell_executable to /bin/sh 10215 1727204059.12576: variable 'ansible_shell_executable' from source: unknown 10215 1727204059.12578: variable 'ansible_connection' from source: unknown 10215 1727204059.12583: variable 'ansible_module_compression' from source: unknown 10215 1727204059.12588: variable 'ansible_shell_type' from source: unknown 10215 1727204059.12593: variable 'ansible_shell_executable' from source: unknown 10215 1727204059.12598: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.12604: variable 'ansible_pipelining' from source: unknown 10215 1727204059.12610: variable 'ansible_timeout' from source: unknown 10215 1727204059.12613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.12795: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204059.12812: variable 'omit' from source: magic vars 10215 
1727204059.12818: starting attempt loop 10215 1727204059.12821: running the handler 10215 1727204059.12842: _low_level_execute_command(): starting 10215 1727204059.12851: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204059.13754: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204059.13757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.13760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.13763: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204059.13765: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204059.13785: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.13813: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.13834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.13848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.13986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.15677: stdout chunk (state=3): >>>/root <<< 10215 1727204059.15837: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.15887: stderr chunk (state=3): >>><<< 10215 1727204059.15961: stdout chunk (state=3): >>><<< 10215 1727204059.15984: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.16085: _low_level_execute_command(): starting 10215 1727204059.16091: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580 `" && echo ansible-tmp-1727204059.1604226-11715-161729439812580="` echo /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580 `" ) && sleep 0' 10215 1727204059.17510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.17514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204059.17518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10215 1727204059.17527: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204059.17529: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.17620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.17684: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.17753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.19785: stdout chunk (state=3): >>>ansible-tmp-1727204059.1604226-11715-161729439812580=/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580 <<< 10215 1727204059.19919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.20119: stderr chunk (state=3): >>><<< 10215 1727204059.20123: stdout chunk (state=3): >>><<< 10215 1727204059.20395: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.1604226-11715-161729439812580=/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.20399: 
variable 'ansible_module_compression' from source: unknown 10215 1727204059.20402: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204059.20417: variable 'ansible_facts' from source: unknown 10215 1727204059.20642: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py 10215 1727204059.20915: Sending initial data 10215 1727204059.20919: Sent initial data (156 bytes) 10215 1727204059.22431: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.22526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.22545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.22707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.22722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.24400: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204059.24468: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204059.24497: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpgqg6izfr /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py <<< 10215 1727204059.24507: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpgqg6izfr" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py" <<< 10215 1727204059.24511: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py" <<< 10215 1727204059.26718: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.27066: stderr chunk (state=3): >>><<< 10215 1727204059.27070: stdout chunk (state=3): >>><<< 10215 1727204059.27102: done transferring module to remote 10215 1727204059.27116: _low_level_execute_command(): starting 10215 1727204059.27122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/ /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py && sleep 0' 10215 1727204059.28568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.28572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.28591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204059.28598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.28618: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204059.28631: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.28638: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.28835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.28852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.28858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.28929: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.30867: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.30966: stderr chunk (state=3): >>><<< 10215 1727204059.30970: stdout chunk (state=3): >>><<< 10215 1727204059.30993: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.30997: _low_level_execute_command(): starting 10215 1727204059.31006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/AnsiballZ_command.py && sleep 0' 10215 1727204059.32253: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204059.32311: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.32512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.32604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.32655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.50923: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.4/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:19.504093", "end": "2024-09-24 14:54:19.508098", "delta": "0:00:00.004005", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204059.52760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204059.52808: stderr chunk (state=3): >>><<< 10215 1727204059.52813: stdout chunk (state=3): >>><<< 10215 1727204059.52985: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.4/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 235sec preferred_lft 235sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:19.504093", "end": "2024-09-24 14:54:19.508098", "delta": "0:00:00.004005", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
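The JSON blob above is the return value of the ansible.legacy.command module running 'ip -4 a s nm-bond' for the "** TEST check IPv4" task. The playbook source is not part of this log, but from the command, the registered variable (result), the attempts counter and the conditional evaluated a few entries further down ('192.0.2' in result.stdout), the task plausibly looks something like the sketch below; the retry values and the use of controller_device for the interface name are assumptions.

    - name: "** TEST check IPv4"
      command: ip -4 a s nm-bond        # in the playbook the device is likely templated from controller_device, which resolves to nm-bond here
      register: result
      until: "'192.0.2' in result.stdout"
      retries: 20                       # assumed; the log only shows that the first attempt succeeded (attempts: 1)
      delay: 3                          # assumed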
10215 1727204059.52992: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204059.52996: _low_level_execute_command(): starting 10215 1727204059.52998: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.1604226-11715-161729439812580/ > /dev/null 2>&1 && sleep 0' 10215 1727204059.53612: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204059.53628: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.53646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.53678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204059.53791: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.53815: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.53835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.53859: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.53936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.56011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.56024: stdout chunk (state=3): >>><<< 10215 1727204059.56046: stderr chunk (state=3): >>><<< 10215 1727204059.56069: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.56084: handler run complete 10215 1727204059.56195: Evaluated conditional (False): False 10215 1727204059.56344: variable 'result' from source: set_fact 10215 1727204059.56372: Evaluated conditional ('192.0.2' in result.stdout): True 10215 1727204059.56396: attempt loop complete, returning result 10215 1727204059.56404: _execute() done 10215 1727204059.56411: dumping result to json 10215 1727204059.56530: done dumping result, returning 10215 1727204059.56534: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv4 [12b410aa-8751-3c74-8f8e-000000000072] 10215 1727204059.56536: sending task result for task 12b410aa-8751-3c74-8f8e-000000000072 ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.004005", "end": "2024-09-24 14:54:19.508098", "rc": 0, "start": "2024-09-24 14:54:19.504093" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.4/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 235sec preferred_lft 235sec 10215 1727204059.56727: no more pending results, returning what we have 10215 1727204059.56731: results queue empty 10215 1727204059.56733: checking for any_errors_fatal 10215 1727204059.56745: done checking for any_errors_fatal 10215 1727204059.56746: checking for max_fail_percentage 10215 1727204059.56748: done checking for max_fail_percentage 10215 1727204059.56749: checking to see if all hosts have failed and the running result is not ok 10215 1727204059.56750: done checking to see if all hosts have failed 10215 1727204059.56751: getting the remaining hosts for this loop 10215 1727204059.56753: done getting the remaining hosts for this loop 10215 1727204059.56758: getting the next task for host managed-node3 10215 1727204059.56765: done getting next task for host managed-node3 10215 1727204059.56768: ^ task is: TASK: ** TEST check IPv6 10215 1727204059.56770: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204059.56774: getting variables 10215 1727204059.56776: in VariableManager get_vars() 10215 1727204059.57033: Calling all_inventory to load vars for managed-node3 10215 1727204059.57037: Calling groups_inventory to load vars for managed-node3 10215 1727204059.57040: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204059.57053: Calling all_plugins_play to load vars for managed-node3 10215 1727204059.57057: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204059.57060: Calling groups_plugins_play to load vars for managed-node3 10215 1727204059.57627: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000072 10215 1727204059.57631: WORKER PROCESS EXITING 10215 1727204059.59700: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204059.62844: done with get_vars() 10215 1727204059.62903: done getting variables 10215 1727204059.62975: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.529) 0:00:28.197 ***** 10215 1727204059.63020: entering _queue_task() for managed-node3/command 10215 1727204059.63403: worker is 1 (out of 1 available) 10215 1727204059.63420: exiting _queue_task() for managed-node3/command 10215 1727204059.63433: done queuing things up, now waiting for results queue to drain 10215 1727204059.63435: waiting for pending results... 
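The task just queued ("** TEST check IPv6", tests_bond.yml:87) is the IPv6 counterpart of the check above. Based on the command and conditional that appear later in this log ('ip -6 a s nm-bond', '2001' in result.stdout), a rough sketch, with the same caveats about assumed retry values:

    - name: "** TEST check IPv6"
      command: ip -6 a s nm-bond        # likely templated on controller_device, as the variable lookup below suggests
      register: result
      until: "'2001' in result.stdout"
      retries: 20                       # assumed
      delay: 3                          # assumed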
10215 1727204059.63636: running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 10215 1727204059.63717: in run() - task 12b410aa-8751-3c74-8f8e-000000000073 10215 1727204059.63730: variable 'ansible_search_path' from source: unknown 10215 1727204059.63764: calling self._execute() 10215 1727204059.63848: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.63855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.63867: variable 'omit' from source: magic vars 10215 1727204059.64199: variable 'ansible_distribution_major_version' from source: facts 10215 1727204059.64210: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204059.64215: variable 'omit' from source: magic vars 10215 1727204059.64236: variable 'omit' from source: magic vars 10215 1727204059.64318: variable 'controller_device' from source: play vars 10215 1727204059.64336: variable 'omit' from source: magic vars 10215 1727204059.64376: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204059.64411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204059.64428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204059.64448: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204059.64459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204059.64486: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204059.64492: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.64498: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.64585: Set connection var ansible_connection to ssh 10215 1727204059.64592: Set connection var ansible_pipelining to False 10215 1727204059.64600: Set connection var ansible_shell_type to sh 10215 1727204059.64609: Set connection var ansible_timeout to 10 10215 1727204059.64614: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204059.64623: Set connection var ansible_shell_executable to /bin/sh 10215 1727204059.64643: variable 'ansible_shell_executable' from source: unknown 10215 1727204059.64646: variable 'ansible_connection' from source: unknown 10215 1727204059.64649: variable 'ansible_module_compression' from source: unknown 10215 1727204059.64653: variable 'ansible_shell_type' from source: unknown 10215 1727204059.64660: variable 'ansible_shell_executable' from source: unknown 10215 1727204059.64664: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204059.64666: variable 'ansible_pipelining' from source: unknown 10215 1727204059.64669: variable 'ansible_timeout' from source: unknown 10215 1727204059.64675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204059.64800: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204059.64812: variable 'omit' from source: magic vars 10215 
1727204059.64816: starting attempt loop 10215 1727204059.64820: running the handler 10215 1727204059.64836: _low_level_execute_command(): starting 10215 1727204059.64844: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204059.65633: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.65649: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.65724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.67493: stdout chunk (state=3): >>>/root <<< 10215 1727204059.67604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.67664: stderr chunk (state=3): >>><<< 10215 1727204059.67666: stdout chunk (state=3): >>><<< 10215 1727204059.67683: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.67773: _low_level_execute_command(): starting 10215 1727204059.67778: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449 `" && echo ansible-tmp-1727204059.6769044-11733-53071690976449="` echo /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449 `" ) && sleep 0' 10215 1727204059.68149: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config <<< 10215 1727204059.68165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.68177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.68237: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.68241: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.68299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.70342: stdout chunk (state=3): >>>ansible-tmp-1727204059.6769044-11733-53071690976449=/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449 <<< 10215 1727204059.70459: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.70504: stderr chunk (state=3): >>><<< 10215 1727204059.70510: stdout chunk (state=3): >>><<< 10215 1727204059.70532: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.6769044-11733-53071690976449=/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.70560: variable 'ansible_module_compression' from source: unknown 10215 1727204059.70605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204059.70645: variable 'ansible_facts' from source: unknown 10215 1727204059.70696: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py 10215 1727204059.70814: Sending initial data 10215 1727204059.70817: Sent initial 
data (155 bytes) 10215 1727204059.71256: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.71260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.71263: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.71265: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.71326: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.71329: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.71367: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.73034: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10215 1727204059.73049: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204059.73069: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204059.73101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpakh2k7tc /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py <<< 10215 1727204059.73113: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py" <<< 10215 1727204059.73136: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpakh2k7tc" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py" <<< 10215 1727204059.73909: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.73964: stderr chunk (state=3): >>><<< 10215 1727204059.73968: stdout chunk (state=3): >>><<< 10215 1727204059.73988: done transferring module to remote 10215 1727204059.74003: _low_level_execute_command(): starting 10215 1727204059.74011: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/ /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py && sleep 0' 10215 1727204059.74463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.74467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.74469: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.74472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204059.74474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.74516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.74535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.74569: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.76478: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204059.76527: stderr chunk (state=3): >>><<< 10215 1727204059.76530: stdout chunk (state=3): >>><<< 10215 1727204059.76544: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204059.76547: _low_level_execute_command(): starting 10215 1727204059.76553: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/AnsiballZ_command.py && sleep 0' 10215 1727204059.77006: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.77010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204059.77012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204059.77015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204059.77017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204059.77070: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204059.77076: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.77118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.95036: stdout chunk (state=3): >>> {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::10c/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::e127:ad8d:73c2:6a06/64 scope global dynamic noprefixroute \n valid_lft 1801sec preferred_lft 1801sec\n inet6 fe80::a288:13c8:da16:d455/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:19.945439", "end": "2024-09-24 14:54:19.949124", "delta": "0:00:00.003685", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 
1727204059.96816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204059.96821: stdout chunk (state=3): >>><<< 10215 1727204059.96823: stderr chunk (state=3): >>><<< 10215 1727204059.96853: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::10c/128 scope global dynamic noprefixroute \n valid_lft 234sec preferred_lft 234sec\n inet6 2001:db8::e127:ad8d:73c2:6a06/64 scope global dynamic noprefixroute \n valid_lft 1801sec preferred_lft 1801sec\n inet6 fe80::a288:13c8:da16:d455/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-24 14:54:19.945439", "end": "2024-09-24 14:54:19.949124", "delta": "0:00:00.003685", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
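The check that follows ('2001' in result.stdout) is a loose substring match: it passes as soon as the string "2001" appears anywhere in the ip -6 output. A stricter variant would assert on the expected prefix explicitly; this is only an illustrative alternative, not what tests_bond.yml actually does:

    - name: Assert nm-bond has an address from the expected IPv6 prefix   # hypothetical task, not taken from this log
      assert:
        that:
          - "result.stdout is search('inet6 2001:db8::')"
        fail_msg: "no 2001:db8:: address found on nm-bond"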
10215 1727204059.96932: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204059.96994: _low_level_execute_command(): starting 10215 1727204059.96998: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.6769044-11733-53071690976449/ > /dev/null 2>&1 && sleep 0' 10215 1727204059.97670: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204059.97686: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204059.97704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204059.97730: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204059.97863: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204059.97880: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204059.97915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204059.97982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204059.99987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204060.00001: stdout chunk (state=3): >>><<< 10215 1727204060.00016: stderr chunk (state=3): >>><<< 10215 1727204060.00038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204060.00054: handler run complete 10215 1727204060.00094: Evaluated conditional (False): False 10215 1727204060.00395: variable 'result' from source: set_fact 10215 1727204060.00399: Evaluated conditional ('2001' in result.stdout): True 10215 1727204060.00401: attempt loop complete, returning result 10215 1727204060.00403: _execute() done 10215 1727204060.00405: dumping result to json 10215 1727204060.00410: done dumping result, returning 10215 1727204060.00412: done running TaskExecutor() for managed-node3/TASK: ** TEST check IPv6 [12b410aa-8751-3c74-8f8e-000000000073] 10215 1727204060.00414: sending task result for task 12b410aa-8751-3c74-8f8e-000000000073 ok: [managed-node3] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003685", "end": "2024-09-24 14:54:19.949124", "rc": 0, "start": "2024-09-24 14:54:19.945439" } STDOUT: 18: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::10c/128 scope global dynamic noprefixroute valid_lft 234sec preferred_lft 234sec inet6 2001:db8::e127:ad8d:73c2:6a06/64 scope global dynamic noprefixroute valid_lft 1801sec preferred_lft 1801sec inet6 fe80::a288:13c8:da16:d455/64 scope link noprefixroute valid_lft forever preferred_lft forever 10215 1727204060.00633: no more pending results, returning what we have 10215 1727204060.00637: results queue empty 10215 1727204060.00638: checking for any_errors_fatal 10215 1727204060.00645: done checking for any_errors_fatal 10215 1727204060.00646: checking for max_fail_percentage 10215 1727204060.00648: done checking for max_fail_percentage 10215 1727204060.00649: checking to see if all hosts have failed and the running result is not ok 10215 1727204060.00650: done checking to see if all hosts have failed 10215 1727204060.00651: getting the remaining hosts for this loop 10215 1727204060.00653: done getting the remaining hosts for this loop 10215 1727204060.00658: getting the next task for host managed-node3 10215 1727204060.00669: done getting next task for host managed-node3 10215 1727204060.00674: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10215 1727204060.00678: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204060.00892: getting variables 10215 1727204060.00895: in VariableManager get_vars() 10215 1727204060.00940: Calling all_inventory to load vars for managed-node3 10215 1727204060.00943: Calling groups_inventory to load vars for managed-node3 10215 1727204060.00946: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.00958: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.00962: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.00967: Calling groups_plugins_play to load vars for managed-node3 10215 1727204060.01488: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000073 10215 1727204060.01495: WORKER PROCESS EXITING 10215 1727204060.03339: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.11201: done with get_vars() 10215 1727204060.11251: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.483) 0:00:28.680 ***** 10215 1727204060.11363: entering _queue_task() for managed-node3/include_tasks 10215 1727204060.11757: worker is 1 (out of 1 available) 10215 1727204060.11777: exiting _queue_task() for managed-node3/include_tasks 10215 1727204060.11791: done queuing things up, now waiting for results queue to drain 10215 1727204060.11794: waiting for pending results... 10215 1727204060.12211: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 10215 1727204060.12397: in run() - task 12b410aa-8751-3c74-8f8e-00000000007c 10215 1727204060.12402: variable 'ansible_search_path' from source: unknown 10215 1727204060.12405: variable 'ansible_search_path' from source: unknown 10215 1727204060.12443: calling self._execute() 10215 1727204060.12570: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.12586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.12611: variable 'omit' from source: magic vars 10215 1727204060.13194: variable 'ansible_distribution_major_version' from source: facts 10215 1727204060.13199: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204060.13204: _execute() done 10215 1727204060.13209: dumping result to json 10215 1727204060.13214: done dumping result, returning 10215 1727204060.13218: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-3c74-8f8e-00000000007c] 10215 1727204060.13220: sending task result for task 12b410aa-8751-3c74-8f8e-00000000007c 10215 1727204060.13311: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000007c 10215 1727204060.13367: no more pending results, returning what we have 10215 1727204060.13375: in VariableManager get_vars() 10215 1727204060.13611: Calling all_inventory to load vars for managed-node3 10215 1727204060.13615: Calling groups_inventory to load vars for managed-node3 10215 1727204060.13618: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.13633: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.13637: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.13641: Calling groups_plugins_play to load vars for 
managed-node3 10215 1727204060.14210: WORKER PROCESS EXITING 10215 1727204060.16066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.19203: done with get_vars() 10215 1727204060.19244: variable 'ansible_search_path' from source: unknown 10215 1727204060.19246: variable 'ansible_search_path' from source: unknown 10215 1727204060.19303: we have included files to process 10215 1727204060.19304: generating all_blocks data 10215 1727204060.19310: done generating all_blocks data 10215 1727204060.19317: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204060.19322: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204060.19326: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 10215 1727204060.20088: done processing included file 10215 1727204060.20091: iterating over new_blocks loaded from include file 10215 1727204060.20095: in VariableManager get_vars() 10215 1727204060.20135: done with get_vars() 10215 1727204060.20137: filtering new block on tags 10215 1727204060.20180: done filtering new block on tags 10215 1727204060.20184: in VariableManager get_vars() 10215 1727204060.20227: done with get_vars() 10215 1727204060.20230: filtering new block on tags 10215 1727204060.20290: done filtering new block on tags 10215 1727204060.20294: in VariableManager get_vars() 10215 1727204060.20336: done with get_vars() 10215 1727204060.20339: filtering new block on tags 10215 1727204060.20395: done filtering new block on tags 10215 1727204060.20398: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node3 10215 1727204060.20405: extending task lists for all hosts with included blocks 10215 1727204060.22014: done extending task lists 10215 1727204060.22016: done processing included files 10215 1727204060.22017: results queue empty 10215 1727204060.22022: checking for any_errors_fatal 10215 1727204060.22029: done checking for any_errors_fatal 10215 1727204060.22030: checking for max_fail_percentage 10215 1727204060.22031: done checking for max_fail_percentage 10215 1727204060.22032: checking to see if all hosts have failed and the running result is not ok 10215 1727204060.22033: done checking to see if all hosts have failed 10215 1727204060.22034: getting the remaining hosts for this loop 10215 1727204060.22036: done getting the remaining hosts for this loop 10215 1727204060.22039: getting the next task for host managed-node3 10215 1727204060.22045: done getting next task for host managed-node3 10215 1727204060.22048: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10215 1727204060.22052: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204060.22064: getting variables 10215 1727204060.22066: in VariableManager get_vars() 10215 1727204060.22087: Calling all_inventory to load vars for managed-node3 10215 1727204060.22091: Calling groups_inventory to load vars for managed-node3 10215 1727204060.22094: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.22101: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.22105: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.22111: Calling groups_plugins_play to load vars for managed-node3 10215 1727204060.24253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.27332: done with get_vars() 10215 1727204060.27376: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.161) 0:00:28.842 ***** 10215 1727204060.27485: entering _queue_task() for managed-node3/setup 10215 1727204060.28124: worker is 1 (out of 1 available) 10215 1727204060.28135: exiting _queue_task() for managed-node3/setup 10215 1727204060.28145: done queuing things up, now waiting for results queue to drain 10215 1727204060.28147: waiting for pending results... 
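The setup task queued above comes from the role's set_facts.yml. As the entries below show, it is skipped because the conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0) evaluates to False, i.e. every fact the role needs is already present. The task is therefore presumably a conditional fact-gathering step along these lines; only the when expression is taken from the log, and the gather_subset value is an assumption:

    - name: Ensure ansible_facts used by role are present
      setup:
        gather_subset: min              # assumed; the actual subset is not visible in this log
      when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0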
10215 1727204060.28242: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 10215 1727204060.28456: in run() - task 12b410aa-8751-3c74-8f8e-000000000491 10215 1727204060.28590: variable 'ansible_search_path' from source: unknown 10215 1727204060.28595: variable 'ansible_search_path' from source: unknown 10215 1727204060.28599: calling self._execute() 10215 1727204060.28650: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.28665: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.28682: variable 'omit' from source: magic vars 10215 1727204060.29279: variable 'ansible_distribution_major_version' from source: facts 10215 1727204060.29301: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204060.29605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204060.33055: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204060.33159: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204060.33211: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204060.33265: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204060.33305: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204060.33412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204060.33461: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204060.33615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204060.33676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204060.33786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204060.33792: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204060.33817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204060.33854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204060.33918: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204060.33943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204060.34154: variable '__network_required_facts' from source: role '' defaults 10215 1727204060.34167: variable 'ansible_facts' from source: unknown 10215 1727204060.35463: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 10215 1727204060.35473: when evaluation is False, skipping this task 10215 1727204060.35481: _execute() done 10215 1727204060.35491: dumping result to json 10215 1727204060.35500: done dumping result, returning 10215 1727204060.35516: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-3c74-8f8e-000000000491] 10215 1727204060.35529: sending task result for task 12b410aa-8751-3c74-8f8e-000000000491 10215 1727204060.35805: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000491 10215 1727204060.35811: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204060.35860: no more pending results, returning what we have 10215 1727204060.35865: results queue empty 10215 1727204060.35866: checking for any_errors_fatal 10215 1727204060.35870: done checking for any_errors_fatal 10215 1727204060.35871: checking for max_fail_percentage 10215 1727204060.35873: done checking for max_fail_percentage 10215 1727204060.35874: checking to see if all hosts have failed and the running result is not ok 10215 1727204060.35875: done checking to see if all hosts have failed 10215 1727204060.35876: getting the remaining hosts for this loop 10215 1727204060.35878: done getting the remaining hosts for this loop 10215 1727204060.35883: getting the next task for host managed-node3 10215 1727204060.35898: done getting next task for host managed-node3 10215 1727204060.35903: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 10215 1727204060.35911: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204060.35931: getting variables 10215 1727204060.35933: in VariableManager get_vars() 10215 1727204060.35981: Calling all_inventory to load vars for managed-node3 10215 1727204060.35984: Calling groups_inventory to load vars for managed-node3 10215 1727204060.35987: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.36202: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.36209: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.36214: Calling groups_plugins_play to load vars for managed-node3 10215 1727204060.38511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.42786: done with get_vars() 10215 1727204060.42827: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.154) 0:00:28.996 ***** 10215 1727204060.42965: entering _queue_task() for managed-node3/stat 10215 1727204060.43417: worker is 1 (out of 1 available) 10215 1727204060.43431: exiting _queue_task() for managed-node3/stat 10215 1727204060.43444: done queuing things up, now waiting for results queue to drain 10215 1727204060.43446: waiting for pending results... 10215 1727204060.43812: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree 10215 1727204060.43936: in run() - task 12b410aa-8751-3c74-8f8e-000000000493 10215 1727204060.43994: variable 'ansible_search_path' from source: unknown 10215 1727204060.43998: variable 'ansible_search_path' from source: unknown 10215 1727204060.44015: calling self._execute() 10215 1727204060.44127: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.44144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.44164: variable 'omit' from source: magic vars 10215 1727204060.44627: variable 'ansible_distribution_major_version' from source: facts 10215 1727204060.44688: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204060.44896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204060.45256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204060.45320: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204060.45374: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204060.45426: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204060.45594: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204060.45895: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204060.45898: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, 
class_only=False) 10215 1727204060.45901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204060.45964: variable '__network_is_ostree' from source: set_fact 10215 1727204060.45978: Evaluated conditional (not __network_is_ostree is defined): False 10215 1727204060.45987: when evaluation is False, skipping this task 10215 1727204060.45998: _execute() done 10215 1727204060.46010: dumping result to json 10215 1727204060.46025: done dumping result, returning 10215 1727204060.46039: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-3c74-8f8e-000000000493] 10215 1727204060.46050: sending task result for task 12b410aa-8751-3c74-8f8e-000000000493 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10215 1727204060.46228: no more pending results, returning what we have 10215 1727204060.46233: results queue empty 10215 1727204060.46234: checking for any_errors_fatal 10215 1727204060.46241: done checking for any_errors_fatal 10215 1727204060.46242: checking for max_fail_percentage 10215 1727204060.46244: done checking for max_fail_percentage 10215 1727204060.46246: checking to see if all hosts have failed and the running result is not ok 10215 1727204060.46247: done checking to see if all hosts have failed 10215 1727204060.46248: getting the remaining hosts for this loop 10215 1727204060.46250: done getting the remaining hosts for this loop 10215 1727204060.46255: getting the next task for host managed-node3 10215 1727204060.46265: done getting next task for host managed-node3 10215 1727204060.46268: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10215 1727204060.46275: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204060.46398: getting variables 10215 1727204060.46401: in VariableManager get_vars() 10215 1727204060.46453: Calling all_inventory to load vars for managed-node3 10215 1727204060.46457: Calling groups_inventory to load vars for managed-node3 10215 1727204060.46461: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.46702: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.46709: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.46714: Calling groups_plugins_play to load vars for managed-node3 10215 1727204060.47406: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000493 10215 1727204060.47413: WORKER PROCESS EXITING 10215 1727204060.48968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.53367: done with get_vars() 10215 1727204060.53416: done getting variables 10215 1727204060.53488: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.105) 0:00:29.102 ***** 10215 1727204060.53541: entering _queue_task() for managed-node3/set_fact 10215 1727204060.53926: worker is 1 (out of 1 available) 10215 1727204060.53940: exiting _queue_task() for managed-node3/set_fact 10215 1727204060.53955: done queuing things up, now waiting for results queue to drain 10215 1727204060.53958: waiting for pending results... 
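Note on the pair of skipped tasks here: the "Check if system is ostree" stat task (set_facts.yml:12) was just skipped, and the companion "Set flag to indicate system is ostree" set_fact task (set_facts.yml:17) is being queued; both carry the guard "not __network_is_ostree is defined", which evaluates to False in this run because the fact was already set earlier, so the TaskExecutor short-circuits before the module would run. The role file itself is not reproduced in this log; the following is only a minimal sketch of the guarded stat/set_fact pattern that these task names, actions, and conditionals imply. The /run/ostree-booted path and the register name __ostree_stat are assumptions for illustration, not the role's code.

# Sketch only -- pattern implied by the logged task names and conditionals.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed marker file
  register: __ostree_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_stat.stat.exists }}"
  when: not __network_is_ostree is defined
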
10215 1727204060.54322: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 10215 1727204060.54770: in run() - task 12b410aa-8751-3c74-8f8e-000000000494 10215 1727204060.55097: variable 'ansible_search_path' from source: unknown 10215 1727204060.55101: variable 'ansible_search_path' from source: unknown 10215 1727204060.55104: calling self._execute() 10215 1727204060.55188: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.55329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.55348: variable 'omit' from source: magic vars 10215 1727204060.56237: variable 'ansible_distribution_major_version' from source: facts 10215 1727204060.56313: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204060.56768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204060.57382: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204060.57451: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204060.57500: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204060.57543: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204060.57699: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204060.57740: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204060.58097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204060.58101: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204060.58257: variable '__network_is_ostree' from source: set_fact 10215 1727204060.58270: Evaluated conditional (not __network_is_ostree is defined): False 10215 1727204060.58279: when evaluation is False, skipping this task 10215 1727204060.58287: _execute() done 10215 1727204060.58303: dumping result to json 10215 1727204060.58400: done dumping result, returning 10215 1727204060.58420: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-3c74-8f8e-000000000494] 10215 1727204060.58435: sending task result for task 12b410aa-8751-3c74-8f8e-000000000494 skipping: [managed-node3] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 10215 1727204060.58669: no more pending results, returning what we have 10215 1727204060.58674: results queue empty 10215 1727204060.58676: checking for any_errors_fatal 10215 1727204060.58683: done checking for any_errors_fatal 10215 1727204060.58684: checking for max_fail_percentage 10215 1727204060.58685: done checking for max_fail_percentage 10215 1727204060.58687: checking to see 
if all hosts have failed and the running result is not ok 10215 1727204060.58688: done checking to see if all hosts have failed 10215 1727204060.58690: getting the remaining hosts for this loop 10215 1727204060.58693: done getting the remaining hosts for this loop 10215 1727204060.58698: getting the next task for host managed-node3 10215 1727204060.58713: done getting next task for host managed-node3 10215 1727204060.58718: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 10215 1727204060.58724: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204060.58748: getting variables 10215 1727204060.58751: in VariableManager get_vars() 10215 1727204060.58802: Calling all_inventory to load vars for managed-node3 10215 1727204060.58806: Calling groups_inventory to load vars for managed-node3 10215 1727204060.58811: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204060.58903: Calling all_plugins_play to load vars for managed-node3 10215 1727204060.58911: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204060.58915: Calling groups_plugins_play to load vars for managed-node3 10215 1727204060.59709: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000494 10215 1727204060.59713: WORKER PROCESS EXITING 10215 1727204060.61521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204060.65084: done with get_vars() 10215 1727204060.65124: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.117) 0:00:29.219 ***** 10215 1727204060.65250: entering _queue_task() for managed-node3/service_facts 10215 1727204060.65606: worker is 1 (out of 1 available) 10215 1727204060.65620: exiting _queue_task() for managed-node3/service_facts 10215 1727204060.65634: done queuing things up, now waiting for results queue to drain 10215 1727204060.65636: waiting for pending results... 
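Note on the upcoming task: "Check which services are running" (set_facts.yml:21) invokes the ansible.builtin.service_facts module. Its return value, visible in the module stdout further down, is a dictionary under ansible_facts.services keyed by unit name, with name/state/status/source fields per service. A minimal, self-contained example of gathering and consuming that structure is sketched below; the NetworkManager check is illustrative only and is not taken from the role.

- name: Check which services are running
  ansible.builtin.service_facts:

- name: Act on the gathered service facts (illustrative consumer only)
  ansible.builtin.debug:
    msg: "NetworkManager is running on this host"
  when:
    - "'NetworkManager.service' in ansible_facts.services"
    - ansible_facts.services['NetworkManager.service'].state == 'running'
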
10215 1727204060.65956: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running 10215 1727204060.66165: in run() - task 12b410aa-8751-3c74-8f8e-000000000496 10215 1727204060.66188: variable 'ansible_search_path' from source: unknown 10215 1727204060.66200: variable 'ansible_search_path' from source: unknown 10215 1727204060.66250: calling self._execute() 10215 1727204060.66371: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.66386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.66406: variable 'omit' from source: magic vars 10215 1727204060.66860: variable 'ansible_distribution_major_version' from source: facts 10215 1727204060.66885: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204060.66902: variable 'omit' from source: magic vars 10215 1727204060.67018: variable 'omit' from source: magic vars 10215 1727204060.67067: variable 'omit' from source: magic vars 10215 1727204060.67123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204060.67170: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204060.67213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204060.67239: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204060.67258: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204060.67300: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204060.67312: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.67323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.67453: Set connection var ansible_connection to ssh 10215 1727204060.67466: Set connection var ansible_pipelining to False 10215 1727204060.67479: Set connection var ansible_shell_type to sh 10215 1727204060.67494: Set connection var ansible_timeout to 10 10215 1727204060.67507: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204060.67524: Set connection var ansible_shell_executable to /bin/sh 10215 1727204060.67558: variable 'ansible_shell_executable' from source: unknown 10215 1727204060.67643: variable 'ansible_connection' from source: unknown 10215 1727204060.67647: variable 'ansible_module_compression' from source: unknown 10215 1727204060.67649: variable 'ansible_shell_type' from source: unknown 10215 1727204060.67651: variable 'ansible_shell_executable' from source: unknown 10215 1727204060.67654: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204060.67656: variable 'ansible_pipelining' from source: unknown 10215 1727204060.67658: variable 'ansible_timeout' from source: unknown 10215 1727204060.67660: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204060.67837: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204060.67854: variable 'omit' from source: magic vars 10215 
1727204060.67864: starting attempt loop 10215 1727204060.67870: running the handler 10215 1727204060.67895: _low_level_execute_command(): starting 10215 1727204060.67908: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204060.68719: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204060.68796: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204060.68821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204060.68824: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204060.68900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204060.70647: stdout chunk (state=3): >>>/root <<< 10215 1727204060.70895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204060.70899: stdout chunk (state=3): >>><<< 10215 1727204060.70901: stderr chunk (state=3): >>><<< 10215 1727204060.70904: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204060.70906: _low_level_execute_command(): starting 10215 1727204060.70912: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670 `" && echo ansible-tmp-1727204060.7085671-11769-243246552716670="` echo /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670 `" ) && sleep 0' 10215 
1727204060.71554: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204060.71696: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204060.71702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204060.71789: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204060.71799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204060.73842: stdout chunk (state=3): >>>ansible-tmp-1727204060.7085671-11769-243246552716670=/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670 <<< 10215 1727204060.74016: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204060.74019: stderr chunk (state=3): >>><<< 10215 1727204060.74022: stdout chunk (state=3): >>><<< 10215 1727204060.74195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204060.7085671-11769-243246552716670=/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204060.74198: variable 'ansible_module_compression' from source: unknown 10215 1727204060.74201: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 10215 1727204060.74203: variable 'ansible_facts' from source: unknown 10215 1727204060.74288: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py 
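Note on the transfer sequence logged around this point: because ansible_pipelining is False for this connection (see the "Set connection var" lines above), running service_facts takes the full temp-file round trip -- create a remote ~/.ansible/tmp/ansible-tmp-... directory, upload the cached AnsiballZ_service_facts.py payload over SFTP (above and below), then chmod it and execute it with the remote /usr/bin/python3.12. Enabling pipelining sends the module over the existing SSH session's stdin instead and skips these extra round trips; a common way to turn it on (assuming requiretty is not enforced in the targets' sudoers) is a group or host variable, for example:

# Illustrative placement, e.g. group_vars/all.yml
ansible_pipelining: true
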
10215 1727204060.74457: Sending initial data 10215 1727204060.74460: Sent initial data (162 bytes) 10215 1727204060.75205: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204060.75240: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204060.75253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204060.75263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204060.75329: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204060.77012: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 10215 1727204060.77016: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 10215 1727204060.77018: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 10215 1727204060.77022: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204060.77078: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204060.77134: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpy_3ffjsv /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py <<< 10215 1727204060.77138: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py" <<< 10215 1727204060.77154: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpy_3ffjsv" to remote "/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py" <<< 10215 1727204060.78380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204060.78496: stderr chunk (state=3): >>><<< 10215 1727204060.78499: stdout chunk (state=3): >>><<< 10215 1727204060.78509: done transferring module to remote 10215 1727204060.78527: _low_level_execute_command(): starting 10215 1727204060.78565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/ /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py && sleep 0' 10215 1727204060.79195: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204060.79207: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204060.79226: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204060.79252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204060.79280: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204060.79369: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204060.79372: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204060.79434: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204060.79460: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204060.81371: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204060.81427: stderr chunk (state=3): >>><<< 10215 1727204060.81431: stdout chunk (state=3): >>><<< 10215 1727204060.81445: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204060.81448: _low_level_execute_command(): starting 10215 1727204060.81455: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/AnsiballZ_service_facts.py && sleep 0' 10215 1727204060.81875: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204060.81913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204060.81916: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204060.81919: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10215 1727204060.81922: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204060.81926: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204060.81978: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204060.81985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204060.82029: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204062.71932: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 10215 1727204062.72002: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": 
"modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": 
"dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.servi<<< 10215 1727204062.72071: stdout chunk (state=3): >>>ce", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": 
"mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": 
"system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": 
"systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 10215 1727204062.73626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204062.73686: stderr chunk (state=3): >>><<< 10215 1727204062.73690: stdout chunk (state=3): >>><<< 10215 1727204062.73715: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": 
{"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204062.74681: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204062.74719: _low_level_execute_command(): starting 10215 1727204062.74727: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204060.7085671-11769-243246552716670/ > /dev/null 2>&1 && sleep 0' 10215 1727204062.75474: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204062.75498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.75502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204062.75584: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204062.75590: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204062.75593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.75596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204062.75638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204062.75645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204062.75713: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 10215 1727204062.77643: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204062.77693: stderr chunk (state=3): >>><<< 10215 1727204062.77697: stdout chunk (state=3): >>><<< 10215 1727204062.77712: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204062.77718: handler run complete 10215 1727204062.77895: variable 'ansible_facts' from source: unknown 10215 1727204062.78064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204062.78944: variable 'ansible_facts' from source: unknown 10215 1727204062.79062: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204062.79345: attempt loop complete, returning result 10215 1727204062.79354: _execute() done 10215 1727204062.79357: dumping result to json 10215 1727204062.79436: done dumping result, returning 10215 1727204062.79448: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-3c74-8f8e-000000000496] 10215 1727204062.79456: sending task result for task 12b410aa-8751-3c74-8f8e-000000000496 10215 1727204062.80804: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000496 10215 1727204062.80810: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204062.80876: no more pending results, returning what we have 10215 1727204062.80878: results queue empty 10215 1727204062.80879: checking for any_errors_fatal 10215 1727204062.80882: done checking for any_errors_fatal 10215 1727204062.80882: checking for max_fail_percentage 10215 1727204062.80883: done checking for max_fail_percentage 10215 1727204062.80884: checking to see if all hosts have failed and the running result is not ok 10215 1727204062.80885: done checking to see if all hosts have failed 10215 1727204062.80885: getting the remaining hosts for this loop 10215 1727204062.80886: done getting the remaining hosts for this loop 10215 1727204062.80892: getting the next task for host managed-node3 10215 1727204062.80897: done getting next task for host managed-node3 10215 1727204062.80900: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 10215 1727204062.80904: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204062.80912: getting variables 10215 1727204062.80914: in VariableManager get_vars() 10215 1727204062.80943: Calling all_inventory to load vars for managed-node3 10215 1727204062.80946: Calling groups_inventory to load vars for managed-node3 10215 1727204062.80947: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204062.80956: Calling all_plugins_play to load vars for managed-node3 10215 1727204062.80958: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204062.80960: Calling groups_plugins_play to load vars for managed-node3 10215 1727204062.82099: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204062.84231: done with get_vars() 10215 1727204062.84258: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:22 -0400 (0:00:02.190) 0:00:31.410 ***** 10215 1727204062.84348: entering _queue_task() for managed-node3/package_facts 10215 1727204062.84584: worker is 1 (out of 1 available) 10215 1727204062.84599: exiting _queue_task() for managed-node3/package_facts 10215 1727204062.84611: done queuing things up, now waiting for results queue to drain 10215 1727204062.84612: waiting for pending results... 
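[Editor's aside, not part of the captured log.] The two fact-gathering tasks traced here (service_facts above, package_facts queued below) return dictionaries shaped exactly like the JSON payloads in this stream: ansible_facts.services keyed by unit name with name/state/status/source fields, and ansible_facts.packages keyed by package name with a list of version entries per key. As a minimal, hypothetical illustration only (this task is not part of the run being traced, and it is not how the fedora.linux_system_roles.network role itself consumes these facts), a follow-on task could read them like this:

    - name: Example only - inspect gathered service facts
      ansible.builtin.debug:
        msg: >-
          NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }};
          firewalld status is {{ ansible_facts.services['firewalld.service'].status }}

Both keys appear in the service_facts payload captured above ("running" and "disabled" on this host). The package_facts result later in this log is accessed the same way, e.g. ansible_facts.packages['libgcc'][0].version, since each package maps to a list of installed versions. Note that the role runs these fact modules with no_log, so the playbook-level result is censored ("the output has been hidden...") even though the raw module output is visible in this debug stream.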
10215 1727204062.84812: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed 10215 1727204062.84944: in run() - task 12b410aa-8751-3c74-8f8e-000000000497 10215 1727204062.84959: variable 'ansible_search_path' from source: unknown 10215 1727204062.84964: variable 'ansible_search_path' from source: unknown 10215 1727204062.84995: calling self._execute() 10215 1727204062.85078: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204062.85084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204062.85098: variable 'omit' from source: magic vars 10215 1727204062.85423: variable 'ansible_distribution_major_version' from source: facts 10215 1727204062.85434: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204062.85441: variable 'omit' from source: magic vars 10215 1727204062.85508: variable 'omit' from source: magic vars 10215 1727204062.85540: variable 'omit' from source: magic vars 10215 1727204062.85574: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204062.85607: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204062.85630: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204062.85646: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204062.85657: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204062.85684: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204062.85688: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204062.85695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204062.85780: Set connection var ansible_connection to ssh 10215 1727204062.85786: Set connection var ansible_pipelining to False 10215 1727204062.85795: Set connection var ansible_shell_type to sh 10215 1727204062.85802: Set connection var ansible_timeout to 10 10215 1727204062.85808: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204062.85820: Set connection var ansible_shell_executable to /bin/sh 10215 1727204062.85842: variable 'ansible_shell_executable' from source: unknown 10215 1727204062.85845: variable 'ansible_connection' from source: unknown 10215 1727204062.85848: variable 'ansible_module_compression' from source: unknown 10215 1727204062.85853: variable 'ansible_shell_type' from source: unknown 10215 1727204062.85856: variable 'ansible_shell_executable' from source: unknown 10215 1727204062.85861: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204062.85866: variable 'ansible_pipelining' from source: unknown 10215 1727204062.85870: variable 'ansible_timeout' from source: unknown 10215 1727204062.85875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204062.86046: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204062.86055: variable 'omit' from source: magic vars 10215 
1727204062.86065: starting attempt loop 10215 1727204062.86068: running the handler 10215 1727204062.86081: _low_level_execute_command(): starting 10215 1727204062.86088: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204062.86614: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204062.86620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204062.86624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.86674: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204062.86678: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204062.86726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204062.88455: stdout chunk (state=3): >>>/root <<< 10215 1727204062.88564: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204062.88615: stderr chunk (state=3): >>><<< 10215 1727204062.88619: stdout chunk (state=3): >>><<< 10215 1727204062.88638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204062.88649: _low_level_execute_command(): starting 10215 1727204062.88656: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397 `" && echo ansible-tmp-1727204062.8863776-11850-175922151673397="` echo 
/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397 `" ) && sleep 0' 10215 1727204062.89111: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204062.89115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.89118: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204062.89129: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204062.89132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.89176: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204062.89179: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204062.89221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204062.91184: stdout chunk (state=3): >>>ansible-tmp-1727204062.8863776-11850-175922151673397=/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397 <<< 10215 1727204062.91305: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204062.91359: stderr chunk (state=3): >>><<< 10215 1727204062.91363: stdout chunk (state=3): >>><<< 10215 1727204062.91379: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204062.8863776-11850-175922151673397=/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204062.91433: variable 'ansible_module_compression' from source: unknown 10215 1727204062.91475: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 10215 1727204062.91540: variable 'ansible_facts' from source: unknown 10215 1727204062.91676: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py 10215 1727204062.91805: Sending initial data 10215 1727204062.91808: Sent initial data (162 bytes) 10215 1727204062.92283: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204062.92286: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.92289: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204062.92302: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.92350: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204062.92357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204062.92398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204062.93985: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10215 1727204062.94000: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204062.94024: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204062.94054: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpje5doije /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py <<< 10215 1727204062.94063: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py" <<< 10215 1727204062.94091: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpje5doije" to remote "/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py" <<< 10215 1727204062.96286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204062.96350: stderr chunk (state=3): >>><<< 10215 1727204062.96353: stdout chunk (state=3): >>><<< 10215 1727204062.96373: done transferring module to remote 10215 1727204062.96383: _low_level_execute_command(): starting 10215 1727204062.96392: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/ /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py && sleep 0' 10215 1727204062.96839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204062.96842: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204062.96845: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.96849: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204062.96851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204062.96907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204062.96910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204062.96951: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204062.99096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204062.99100: stdout chunk (state=3): >>><<< 10215 1727204062.99103: stderr chunk (state=3): >>><<< 10215 1727204062.99105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204062.99107: _low_level_execute_command(): starting 10215 1727204062.99110: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/AnsiballZ_package_facts.py && sleep 0' 10215 1727204063.00575: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204063.00621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204063.64311: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 10215 1727204063.64510: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": 
"readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": 
"exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", 
"release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": 
"100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 10215 1727204063.64534: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 10215 1727204063.64542: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": 
"openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 10215 1727204063.64722: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, 
"arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": 
"avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", 
"version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": 
"python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": 
[{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 10215 1727204063.66532: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204063.66624: stderr chunk (state=3): >>><<< 10215 1727204063.66628: stdout chunk (state=3): >>><<< 10215 1727204063.66866: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": 
"2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": 
"2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": 
"2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", 
"version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": 
[{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], 
"libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": 
"1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": 
"jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204063.72265: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204063.72301: _low_level_execute_command(): starting 10215 1727204063.72317: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204062.8863776-11850-175922151673397/ > /dev/null 2>&1 && sleep 0' 10215 1727204063.73886: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204063.73909: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204063.73925: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204063.73997: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204063.76105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204063.76234: stderr chunk (state=3): >>><<< 10215 1727204063.76237: stdout chunk (state=3): >>><<< 10215 1727204063.76240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204063.76242: handler run complete 10215 1727204063.79288: variable 'ansible_facts' from source: unknown 10215 1727204063.81157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204063.86879: variable 'ansible_facts' from source: unknown 10215 1727204063.87637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204063.89104: attempt loop complete, returning result 10215 1727204063.89124: _execute() done 10215 1727204063.89128: dumping result to json 10215 1727204063.89453: done dumping result, returning 10215 1727204063.89471: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-3c74-8f8e-000000000497] 10215 1727204063.89481: sending task result for task 12b410aa-8751-3c74-8f8e-000000000497 10215 1727204063.94015: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000497 10215 1727204063.94019: WORKER PROCESS EXITING ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204063.94190: no more pending results, returning what we have 10215 1727204063.94194: results queue empty 10215 1727204063.94195: checking for any_errors_fatal 10215 1727204063.94200: done checking for any_errors_fatal 10215 1727204063.94201: checking for max_fail_percentage 10215 1727204063.94203: done checking for max_fail_percentage 10215 1727204063.94204: checking to see if all hosts 
have failed and the running result is not ok 10215 1727204063.94205: done checking to see if all hosts have failed 10215 1727204063.94206: getting the remaining hosts for this loop 10215 1727204063.94207: done getting the remaining hosts for this loop 10215 1727204063.94230: getting the next task for host managed-node3 10215 1727204063.94239: done getting next task for host managed-node3 10215 1727204063.94244: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 10215 1727204063.94248: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204063.94263: getting variables 10215 1727204063.94265: in VariableManager get_vars() 10215 1727204063.94431: Calling all_inventory to load vars for managed-node3 10215 1727204063.94435: Calling groups_inventory to load vars for managed-node3 10215 1727204063.94437: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204063.94449: Calling all_plugins_play to load vars for managed-node3 10215 1727204063.94453: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204063.94458: Calling groups_plugins_play to load vars for managed-node3 10215 1727204063.97348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.00668: done with get_vars() 10215 1727204064.00718: done getting variables 10215 1727204064.00796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:24 -0400 (0:00:01.164) 0:00:32.575 ***** 10215 1727204064.00849: entering _queue_task() for managed-node3/debug 10215 1727204064.01320: worker is 1 (out of 1 available) 10215 1727204064.01333: exiting _queue_task() for managed-node3/debug 10215 1727204064.01345: done queuing things up, now waiting for results queue to drain 10215 1727204064.01347: waiting for pending results... 
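The censored package_facts result above was produced with module_args {"manager": ["auto"], "strategy": "first"} and lands in ansible_facts.packages as a dict keyed by package name, each value a list of {name, version, release, epoch, arch, source} entries, as the large JSON dump shows. A minimal sketch of an equivalent standalone task plus a consumer of that fact (illustrative only; this is not the role's actual task file, and the NetworkManager lookup is just an example key taken from the output above):

    - name: Check which packages are installed
      package_facts:
        manager: auto
      no_log: true   # matches the censored result shown in this log

    - name: Example consumer of the collected fact (hypothetical task, not from the role)
      debug:
        msg: "NetworkManager {{ ansible_facts.packages['NetworkManager'][0].version }} is installed"
      when: "'NetworkManager' in ansible_facts.packages"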
10215 1727204064.01717: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider 10215 1727204064.01858: in run() - task 12b410aa-8751-3c74-8f8e-00000000007d 10215 1727204064.01886: variable 'ansible_search_path' from source: unknown 10215 1727204064.01898: variable 'ansible_search_path' from source: unknown 10215 1727204064.01955: calling self._execute() 10215 1727204064.02077: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.02094: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.02114: variable 'omit' from source: magic vars 10215 1727204064.02683: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.02687: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.02691: variable 'omit' from source: magic vars 10215 1727204064.02732: variable 'omit' from source: magic vars 10215 1727204064.02997: variable 'network_provider' from source: set_fact 10215 1727204064.03340: variable 'omit' from source: magic vars 10215 1727204064.03597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204064.03600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204064.03603: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204064.03714: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204064.03739: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204064.03780: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204064.03842: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.04100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.04121: Set connection var ansible_connection to ssh 10215 1727204064.04138: Set connection var ansible_pipelining to False 10215 1727204064.04152: Set connection var ansible_shell_type to sh 10215 1727204064.04167: Set connection var ansible_timeout to 10 10215 1727204064.04220: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204064.04235: Set connection var ansible_shell_executable to /bin/sh 10215 1727204064.04263: variable 'ansible_shell_executable' from source: unknown 10215 1727204064.04326: variable 'ansible_connection' from source: unknown 10215 1727204064.04335: variable 'ansible_module_compression' from source: unknown 10215 1727204064.04344: variable 'ansible_shell_type' from source: unknown 10215 1727204064.04352: variable 'ansible_shell_executable' from source: unknown 10215 1727204064.04360: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.04440: variable 'ansible_pipelining' from source: unknown 10215 1727204064.04453: variable 'ansible_timeout' from source: unknown 10215 1727204064.04463: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.04825: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 10215 1727204064.04844: variable 'omit' from source: magic vars 10215 1727204064.04873: starting attempt loop 10215 1727204064.04881: running the handler 10215 1727204064.04992: handler run complete 10215 1727204064.05071: attempt loop complete, returning result 10215 1727204064.05191: _execute() done 10215 1727204064.05196: dumping result to json 10215 1727204064.05199: done dumping result, returning 10215 1727204064.05202: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-3c74-8f8e-00000000007d] 10215 1727204064.05205: sending task result for task 12b410aa-8751-3c74-8f8e-00000000007d ok: [managed-node3] => {} MSG: Using network provider: nm 10215 1727204064.05395: no more pending results, returning what we have 10215 1727204064.05400: results queue empty 10215 1727204064.05402: checking for any_errors_fatal 10215 1727204064.05415: done checking for any_errors_fatal 10215 1727204064.05416: checking for max_fail_percentage 10215 1727204064.05419: done checking for max_fail_percentage 10215 1727204064.05420: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.05421: done checking to see if all hosts have failed 10215 1727204064.05422: getting the remaining hosts for this loop 10215 1727204064.05429: done getting the remaining hosts for this loop 10215 1727204064.05436: getting the next task for host managed-node3 10215 1727204064.05446: done getting next task for host managed-node3 10215 1727204064.05451: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10215 1727204064.05457: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204064.05473: getting variables 10215 1727204064.05475: in VariableManager get_vars() 10215 1727204064.05687: Calling all_inventory to load vars for managed-node3 10215 1727204064.05693: Calling groups_inventory to load vars for managed-node3 10215 1727204064.05697: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.05711: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.05716: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.05721: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.06397: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000007d 10215 1727204064.06400: WORKER PROCESS EXITING 10215 1727204064.09478: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.14078: done with get_vars() 10215 1727204064.14131: done getting variables 10215 1727204064.14212: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.134) 0:00:32.709 ***** 10215 1727204064.14264: entering _queue_task() for managed-node3/fail 10215 1727204064.14900: worker is 1 (out of 1 available) 10215 1727204064.14915: exiting _queue_task() for managed-node3/fail 10215 1727204064.14927: done queuing things up, now waiting for results queue to drain 10215 1727204064.14929: waiting for pending results... 
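The ok result above ("Using network provider: nm") is emitted by the "Print network provider" task at roles/network/tasks/main.yml:7, using the network_provider variable that the log shows being resolved from set_fact. A hedged sketch of what such a debug task typically looks like (assumed shape, not copied from the collection source):

    - name: Print network provider
      debug:
        msg: "Using network provider: {{ network_provider }}"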
10215 1727204064.15055: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 10215 1727204064.15399: in run() - task 12b410aa-8751-3c74-8f8e-00000000007e 10215 1727204064.15404: variable 'ansible_search_path' from source: unknown 10215 1727204064.15409: variable 'ansible_search_path' from source: unknown 10215 1727204064.15413: calling self._execute() 10215 1727204064.15471: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.15487: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.15519: variable 'omit' from source: magic vars 10215 1727204064.15995: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.16019: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.16186: variable 'network_state' from source: role '' defaults 10215 1727204064.16205: Evaluated conditional (network_state != {}): False 10215 1727204064.16216: when evaluation is False, skipping this task 10215 1727204064.16223: _execute() done 10215 1727204064.16230: dumping result to json 10215 1727204064.16237: done dumping result, returning 10215 1727204064.16247: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-3c74-8f8e-00000000007e] 10215 1727204064.16263: sending task result for task 12b410aa-8751-3c74-8f8e-00000000007e skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204064.16562: no more pending results, returning what we have 10215 1727204064.16567: results queue empty 10215 1727204064.16569: checking for any_errors_fatal 10215 1727204064.16576: done checking for any_errors_fatal 10215 1727204064.16577: checking for max_fail_percentage 10215 1727204064.16579: done checking for max_fail_percentage 10215 1727204064.16586: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.16588: done checking to see if all hosts have failed 10215 1727204064.16589: getting the remaining hosts for this loop 10215 1727204064.16591: done getting the remaining hosts for this loop 10215 1727204064.16598: getting the next task for host managed-node3 10215 1727204064.16606: done getting next task for host managed-node3 10215 1727204064.16613: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10215 1727204064.16619: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204064.16712: getting variables 10215 1727204064.16715: in VariableManager get_vars() 10215 1727204064.16763: Calling all_inventory to load vars for managed-node3 10215 1727204064.16767: Calling groups_inventory to load vars for managed-node3 10215 1727204064.16770: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.16784: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.16788: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.16921: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.17608: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000007e 10215 1727204064.17612: WORKER PROCESS EXITING 10215 1727204064.19342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.22439: done with get_vars() 10215 1727204064.22488: done getting variables 10215 1727204064.22571: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.083) 0:00:32.793 ***** 10215 1727204064.22625: entering _queue_task() for managed-node3/fail 10215 1727204064.23133: worker is 1 (out of 1 available) 10215 1727204064.23148: exiting _queue_task() for managed-node3/fail 10215 1727204064.23162: done queuing things up, now waiting for results queue to drain 10215 1727204064.23164: waiting for pending results... 
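The next guard, queued above from tasks/main.yml line 18, is another fail task. In the execution trace that follows, only network_state != {} is evaluated (and is False), so the version check implied by the task name is never reached. The sketch below is an assumption built from the task name and the reported condition, not the role's literal source.

    # Hedged reconstruction of roles/network/tasks/main.yml:18 (not verbatim)
    - name: Abort applying the network state configuration if the system version of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying the network state configuration requires a managed host running EL 8 or later.  # wording assumed
      when:
        - network_state != {}                             # False on this host, so the task is skipped
        - ansible_distribution_major_version | int < 8    # assumed from the task name; not evaluated in this run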
10215 1727204064.23418: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 10215 1727204064.23618: in run() - task 12b410aa-8751-3c74-8f8e-00000000007f 10215 1727204064.23643: variable 'ansible_search_path' from source: unknown 10215 1727204064.23652: variable 'ansible_search_path' from source: unknown 10215 1727204064.23701: calling self._execute() 10215 1727204064.23831: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.23847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.23865: variable 'omit' from source: magic vars 10215 1727204064.24315: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.24335: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.24509: variable 'network_state' from source: role '' defaults 10215 1727204064.24529: Evaluated conditional (network_state != {}): False 10215 1727204064.24537: when evaluation is False, skipping this task 10215 1727204064.24545: _execute() done 10215 1727204064.24552: dumping result to json 10215 1727204064.24560: done dumping result, returning 10215 1727204064.24572: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-3c74-8f8e-00000000007f] 10215 1727204064.24589: sending task result for task 12b410aa-8751-3c74-8f8e-00000000007f skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204064.24866: no more pending results, returning what we have 10215 1727204064.24870: results queue empty 10215 1727204064.24871: checking for any_errors_fatal 10215 1727204064.24880: done checking for any_errors_fatal 10215 1727204064.24881: checking for max_fail_percentage 10215 1727204064.24883: done checking for max_fail_percentage 10215 1727204064.24884: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.24885: done checking to see if all hosts have failed 10215 1727204064.24886: getting the remaining hosts for this loop 10215 1727204064.24888: done getting the remaining hosts for this loop 10215 1727204064.24895: getting the next task for host managed-node3 10215 1727204064.24904: done getting next task for host managed-node3 10215 1727204064.24911: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10215 1727204064.24917: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 10215 1727204064.24941: getting variables 10215 1727204064.24943: in VariableManager get_vars() 10215 1727204064.25105: Calling all_inventory to load vars for managed-node3 10215 1727204064.25111: Calling groups_inventory to load vars for managed-node3 10215 1727204064.25115: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.25201: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.25205: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.25218: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000007f 10215 1727204064.25222: WORKER PROCESS EXITING 10215 1727204064.25226: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.27492: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.30596: done with get_vars() 10215 1727204064.30649: done getting variables 10215 1727204064.30733: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.081) 0:00:32.874 ***** 10215 1727204064.30783: entering _queue_task() for managed-node3/fail 10215 1727204064.31304: worker is 1 (out of 1 available) 10215 1727204064.31320: exiting _queue_task() for managed-node3/fail 10215 1727204064.31332: done queuing things up, now waiting for results queue to drain 10215 1727204064.31334: waiting for pending results... 
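The teaming guard queued above (tasks/main.yml line 25) is the first of these fail tasks whose skip is not caused by network_state: the trace below shows the EL10-or-later check passing and the distribution check failing. A hedged sketch of what that task plausibly looks like, with only the two logged conditions treated as grounded and everything else marked as an assumption:

    # Hedged reconstruction of roles/network/tasks/main.yml:25 (not verbatim)
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: Teaming is not supported on EL10 or later; use bonding instead.  # wording assumed
      when:
        - ansible_distribution_major_version | int > 9    # True in this run
        - ansible_distribution in __network_rh_distros    # False here, so the task is skipped
        # the real task presumably also inspects network_connections for team profiles; not reached in this run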
10215 1727204064.31588: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 10215 1727204064.31787: in run() - task 12b410aa-8751-3c74-8f8e-000000000080 10215 1727204064.31815: variable 'ansible_search_path' from source: unknown 10215 1727204064.31826: variable 'ansible_search_path' from source: unknown 10215 1727204064.31874: calling self._execute() 10215 1727204064.31988: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.32017: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.32036: variable 'omit' from source: magic vars 10215 1727204064.32515: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.32534: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.32871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204064.36261: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204064.36350: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204064.36411: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204064.36457: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204064.36503: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204064.36615: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.36656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.36795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.36798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.36801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.36903: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.36935: Evaluated conditional (ansible_distribution_major_version | int > 9): True 10215 1727204064.37098: variable 'ansible_distribution' from source: facts 10215 1727204064.37111: variable '__network_rh_distros' from source: role '' defaults 10215 1727204064.37132: Evaluated conditional (ansible_distribution in __network_rh_distros): False 10215 1727204064.37145: when evaluation is False, skipping this task 10215 1727204064.37152: _execute() done 10215 1727204064.37160: dumping result to json 10215 1727204064.37168: done dumping result, returning 10215 1727204064.37180: done running TaskExecutor() 
for managed-node3/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-3c74-8f8e-000000000080] 10215 1727204064.37193: sending task result for task 12b410aa-8751-3c74-8f8e-000000000080 10215 1727204064.37324: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000080 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 10215 1727204064.37553: no more pending results, returning what we have 10215 1727204064.37557: results queue empty 10215 1727204064.37558: checking for any_errors_fatal 10215 1727204064.37565: done checking for any_errors_fatal 10215 1727204064.37566: checking for max_fail_percentage 10215 1727204064.37568: done checking for max_fail_percentage 10215 1727204064.37569: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.37570: done checking to see if all hosts have failed 10215 1727204064.37571: getting the remaining hosts for this loop 10215 1727204064.37573: done getting the remaining hosts for this loop 10215 1727204064.37577: getting the next task for host managed-node3 10215 1727204064.37585: done getting next task for host managed-node3 10215 1727204064.37591: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10215 1727204064.37596: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204064.37623: getting variables 10215 1727204064.37625: in VariableManager get_vars() 10215 1727204064.37673: Calling all_inventory to load vars for managed-node3 10215 1727204064.37677: Calling groups_inventory to load vars for managed-node3 10215 1727204064.37680: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.37803: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.37812: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.37823: WORKER PROCESS EXITING 10215 1727204064.37833: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.42716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.48725: done with get_vars() 10215 1727204064.48773: done getting variables 10215 1727204064.49052: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.183) 0:00:33.058 ***** 10215 1727204064.49095: entering _queue_task() for managed-node3/dnf 10215 1727204064.49926: worker is 1 (out of 1 available) 10215 1727204064.49939: exiting _queue_task() for managed-node3/dnf 10215 1727204064.49952: done queuing things up, now waiting for results queue to drain 10215 1727204064.49953: waiting for pending results... 
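The task queued here (tasks/main.yml line 36) uses the dnf action plugin and, per the trace that follows, is gated on the distribution check and on wireless or team connections being defined. The sketch below is speculative beyond those points: the package list, check_mode and the register name are illustrative assumptions, not taken from the log.

    # Hedged sketch of roles/network/tasks/main.yml:36; module and when-conditions come from the log,
    # the package list, check_mode and register are assumptions
    - name: Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"     # assumed; network_packages is a role default resolved later in this log
        state: latest
      check_mode: true
      register: __network_package_updates  # hypothetical name
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7   # True here
        - __network_wireless_connections_defined or __network_team_connections_defined       # False here, so the task is skipped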
10215 1727204064.50085: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 10215 1727204064.50279: in run() - task 12b410aa-8751-3c74-8f8e-000000000081 10215 1727204064.50433: variable 'ansible_search_path' from source: unknown 10215 1727204064.50444: variable 'ansible_search_path' from source: unknown 10215 1727204064.50549: calling self._execute() 10215 1727204064.50866: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.50870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.50901: variable 'omit' from source: magic vars 10215 1727204064.51975: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.51997: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.52710: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204064.57337: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204064.57431: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204064.57478: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204064.57532: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204064.57570: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204064.57675: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.57726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.57763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.57827: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.57849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.58002: variable 'ansible_distribution' from source: facts 10215 1727204064.58017: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.58030: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 10215 1727204064.58192: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204064.58388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.58426: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.58460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.58525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.58548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.58693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.58696: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.58700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.58733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.58754: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.58814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.58847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.58883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.58945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.58968: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.59192: variable 'network_connections' from source: task vars 10215 1727204064.59219: variable 'port2_profile' from source: play vars 10215 1727204064.59309: variable 'port2_profile' from source: play vars 10215 1727204064.59328: variable 'port1_profile' from source: play vars 10215 1727204064.59413: variable 'port1_profile' from source: play vars 10215 1727204064.59428: variable 'controller_profile' from source: play vars 
10215 1727204064.59559: variable 'controller_profile' from source: play vars 10215 1727204064.59614: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204064.59856: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204064.59916: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204064.59958: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204064.60001: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204064.60062: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204064.60112: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204064.60216: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.60219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204064.60250: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204064.60593: variable 'network_connections' from source: task vars 10215 1727204064.60605: variable 'port2_profile' from source: play vars 10215 1727204064.60687: variable 'port2_profile' from source: play vars 10215 1727204064.60703: variable 'port1_profile' from source: play vars 10215 1727204064.60785: variable 'port1_profile' from source: play vars 10215 1727204064.60801: variable 'controller_profile' from source: play vars 10215 1727204064.60883: variable 'controller_profile' from source: play vars 10215 1727204064.60922: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204064.60931: when evaluation is False, skipping this task 10215 1727204064.60976: _execute() done 10215 1727204064.60979: dumping result to json 10215 1727204064.60981: done dumping result, returning 10215 1727204064.60984: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-000000000081] 10215 1727204064.60986: sending task result for task 12b410aa-8751-3c74-8f8e-000000000081 skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204064.61272: no more pending results, returning what we have 10215 1727204064.61277: results queue empty 10215 1727204064.61278: checking for any_errors_fatal 10215 1727204064.61288: done checking for any_errors_fatal 10215 1727204064.61290: checking for max_fail_percentage 10215 1727204064.61293: done checking for max_fail_percentage 10215 1727204064.61294: checking to see if all hosts have failed and the running result is not ok 10215 
1727204064.61295: done checking to see if all hosts have failed 10215 1727204064.61296: getting the remaining hosts for this loop 10215 1727204064.61298: done getting the remaining hosts for this loop 10215 1727204064.61303: getting the next task for host managed-node3 10215 1727204064.61314: done getting next task for host managed-node3 10215 1727204064.61319: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10215 1727204064.61324: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204064.61346: getting variables 10215 1727204064.61348: in VariableManager get_vars() 10215 1727204064.61595: Calling all_inventory to load vars for managed-node3 10215 1727204064.61599: Calling groups_inventory to load vars for managed-node3 10215 1727204064.61602: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.61617: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.61621: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.61624: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.62310: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000081 10215 1727204064.62314: WORKER PROCESS EXITING 10215 1727204064.63987: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.66976: done with get_vars() 10215 1727204064.67023: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 10215 1727204064.67118: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.180) 0:00:33.238 ***** 10215 1727204064.67158: entering _queue_task() for managed-node3/yum 10215 1727204064.67527: worker is 1 (out of 1 available) 10215 1727204064.67541: exiting _queue_task() for managed-node3/yum 10215 1727204064.67555: done queuing things up, now waiting for results queue to drain 10215 1727204064.67557: waiting for pending results... 
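The companion task for older hosts (tasks/main.yml line 48) is declared with the yum action; the "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above shows ansible-core resolving that alias to the dnf action plugin on this controller. Only the version condition is visible in the trace below, so the rest of this sketch mirrors the DNF variant and is an assumption.

    # Hedged sketch of roles/network/tasks/main.yml:48; only the when-condition is taken from the log
    - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"     # assumed, mirroring the DNF variant
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8    # False on this host, so the task is skipped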
10215 1727204064.68087: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 10215 1727204064.68478: in run() - task 12b410aa-8751-3c74-8f8e-000000000082 10215 1727204064.68644: variable 'ansible_search_path' from source: unknown 10215 1727204064.68654: variable 'ansible_search_path' from source: unknown 10215 1727204064.68701: calling self._execute() 10215 1727204064.68879: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.68962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.68979: variable 'omit' from source: magic vars 10215 1727204064.69872: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.69951: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.70415: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204064.73372: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204064.73463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204064.73514: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204064.73566: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204064.73604: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204064.73711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.73758: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.73796: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.73863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.73888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.74017: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.74040: Evaluated conditional (ansible_distribution_major_version | int < 8): False 10215 1727204064.74048: when evaluation is False, skipping this task 10215 1727204064.74055: _execute() done 10215 1727204064.74068: dumping result to json 10215 1727204064.74076: done dumping result, returning 10215 1727204064.74091: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-000000000082] 10215 
1727204064.74103: sending task result for task 12b410aa-8751-3c74-8f8e-000000000082 skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 10215 1727204064.74414: no more pending results, returning what we have 10215 1727204064.74419: results queue empty 10215 1727204064.74420: checking for any_errors_fatal 10215 1727204064.74427: done checking for any_errors_fatal 10215 1727204064.74428: checking for max_fail_percentage 10215 1727204064.74430: done checking for max_fail_percentage 10215 1727204064.74431: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.74432: done checking to see if all hosts have failed 10215 1727204064.74433: getting the remaining hosts for this loop 10215 1727204064.74435: done getting the remaining hosts for this loop 10215 1727204064.74440: getting the next task for host managed-node3 10215 1727204064.74449: done getting next task for host managed-node3 10215 1727204064.74454: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10215 1727204064.74459: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204064.74481: getting variables 10215 1727204064.74483: in VariableManager get_vars() 10215 1727204064.74535: Calling all_inventory to load vars for managed-node3 10215 1727204064.74539: Calling groups_inventory to load vars for managed-node3 10215 1727204064.74542: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.74556: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.74560: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.74564: Calling groups_plugins_play to load vars for managed-node3 10215 1727204064.75109: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000082 10215 1727204064.75113: WORKER PROCESS EXITING 10215 1727204064.76956: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204064.79962: done with get_vars() 10215 1727204064.80011: done getting variables 10215 1727204064.80084: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.129) 0:00:33.368 ***** 10215 1727204064.80135: entering _queue_task() for managed-node3/fail 10215 1727204064.80626: worker is 1 (out of 1 available) 10215 1727204064.80640: exiting _queue_task() for managed-node3/fail 10215 1727204064.80652: done queuing things up, now waiting for results queue to drain 10215 1727204064.80654: waiting for pending results... 
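The consent task queued above (tasks/main.yml line 60) is another fail guard; the trace below shows it skipped because neither wireless nor team connections are defined in network_connections. The sketch keeps only that logged condition as grounded; the message wording and any consent variable the real task checks are assumptions.

    # Hedged reconstruction of roles/network/tasks/main.yml:60 (not verbatim)
    - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
      ansible.builtin.fail:
        msg: This configuration requires restarting NetworkManager; set the role's consent variable to proceed.  # wording assumed
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined   # False here, so the task is skipped
        # the real task presumably also checks a user-consent variable; it is not visible in this log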
10215 1727204064.80883: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 10215 1727204064.81071: in run() - task 12b410aa-8751-3c74-8f8e-000000000083 10215 1727204064.81096: variable 'ansible_search_path' from source: unknown 10215 1727204064.81111: variable 'ansible_search_path' from source: unknown 10215 1727204064.81158: calling self._execute() 10215 1727204064.81269: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204064.81284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204064.81303: variable 'omit' from source: magic vars 10215 1727204064.81772: variable 'ansible_distribution_major_version' from source: facts 10215 1727204064.81795: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204064.81954: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204064.82231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204064.91170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204064.91262: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204064.91313: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204064.91362: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204064.91398: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204064.91493: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.91539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.91795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.91798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.91801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.91803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.91806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.91810: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.91834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.91858: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.91922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204064.91959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204064.91995: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.92055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204064.92146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204064.92324: variable 'network_connections' from source: task vars 10215 1727204064.92343: variable 'port2_profile' from source: play vars 10215 1727204064.92436: variable 'port2_profile' from source: play vars 10215 1727204064.92451: variable 'port1_profile' from source: play vars 10215 1727204064.92537: variable 'port1_profile' from source: play vars 10215 1727204064.92552: variable 'controller_profile' from source: play vars 10215 1727204064.92636: variable 'controller_profile' from source: play vars 10215 1727204064.92746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204064.92973: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204064.93033: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204064.93074: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204064.93124: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204064.93233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204064.93237: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204064.93249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204064.93288: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204064.93354: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204064.94095: variable 'network_connections' from source: task vars 10215 1727204064.94099: variable 'port2_profile' from source: play vars 10215 1727204064.94324: variable 'port2_profile' from source: play vars 10215 1727204064.94327: variable 'port1_profile' from source: play vars 10215 1727204064.94330: variable 'port1_profile' from source: play vars 10215 1727204064.94332: variable 'controller_profile' from source: play vars 10215 1727204064.94484: variable 'controller_profile' from source: play vars 10215 1727204064.94529: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204064.94763: when evaluation is False, skipping this task 10215 1727204064.94766: _execute() done 10215 1727204064.94768: dumping result to json 10215 1727204064.94770: done dumping result, returning 10215 1727204064.94772: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-000000000083] 10215 1727204064.94774: sending task result for task 12b410aa-8751-3c74-8f8e-000000000083 10215 1727204064.94844: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000083 10215 1727204064.94848: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204064.94931: no more pending results, returning what we have 10215 1727204064.94936: results queue empty 10215 1727204064.94937: checking for any_errors_fatal 10215 1727204064.94944: done checking for any_errors_fatal 10215 1727204064.94946: checking for max_fail_percentage 10215 1727204064.94948: done checking for max_fail_percentage 10215 1727204064.94950: checking to see if all hosts have failed and the running result is not ok 10215 1727204064.94951: done checking to see if all hosts have failed 10215 1727204064.94952: getting the remaining hosts for this loop 10215 1727204064.94954: done getting the remaining hosts for this loop 10215 1727204064.94959: getting the next task for host managed-node3 10215 1727204064.94968: done getting next task for host managed-node3 10215 1727204064.94973: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 10215 1727204064.94978: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 10215 1727204064.95004: getting variables 10215 1727204064.95006: in VariableManager get_vars() 10215 1727204064.95058: Calling all_inventory to load vars for managed-node3 10215 1727204064.95062: Calling groups_inventory to load vars for managed-node3 10215 1727204064.95065: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204064.95077: Calling all_plugins_play to load vars for managed-node3 10215 1727204064.95081: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204064.95085: Calling groups_plugins_play to load vars for managed-node3 10215 1727204065.09758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204065.13021: done with get_vars() 10215 1727204065.13085: done getting variables 10215 1727204065.13156: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.330) 0:00:33.699 ***** 10215 1727204065.13197: entering _queue_task() for managed-node3/package 10215 1727204065.13768: worker is 1 (out of 1 available) 10215 1727204065.13782: exiting _queue_task() for managed-node3/package 10215 1727204065.13796: done queuing things up, now waiting for results queue to drain 10215 1727204065.13798: waiting for pending results... 
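The "Install packages" task queued above (tasks/main.yml line 73) uses the generic package action plugin, and the trace that follows shows network_packages being assembled from role defaults such as __network_packages_default_nm, __network_packages_default_gobject_packages and __network_packages_default_wpa_supplicant. A minimal sketch, with the state and anything beyond the module and variable name treated as assumptions:

    # Hedged reconstruction of roles/network/tasks/main.yml:73; module and variable come from the log
    - name: Install packages
      ansible.builtin.package:
        name: "{{ network_packages }}"   # built from the role defaults resolved in the trace that follows
        state: present                   # assumed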
10215 1727204065.14314: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages 10215 1727204065.14320: in run() - task 12b410aa-8751-3c74-8f8e-000000000084 10215 1727204065.14325: variable 'ansible_search_path' from source: unknown 10215 1727204065.14329: variable 'ansible_search_path' from source: unknown 10215 1727204065.14596: calling self._execute() 10215 1727204065.14600: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.14605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.14611: variable 'omit' from source: magic vars 10215 1727204065.14921: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.14937: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204065.15343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204065.16062: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204065.16488: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204065.16495: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204065.16663: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204065.16867: variable 'network_packages' from source: role '' defaults 10215 1727204065.17042: variable '__network_provider_setup' from source: role '' defaults 10215 1727204065.17051: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204065.17142: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204065.17152: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204065.17231: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204065.17483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204065.20741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204065.20910: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204065.20953: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204065.20995: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204065.21203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204065.21597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.21648: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.21797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.22044: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.22048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.22051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.22074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.22334: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.22385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.22405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.23174: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10215 1727204065.23338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.23374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.23420: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.23481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.23510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.23634: variable 'ansible_python' from source: facts 10215 1727204065.23795: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10215 1727204065.23798: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204065.23896: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204065.24066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.24105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10215 1727204065.24143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.24200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.24230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.24297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.24347: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.24385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.24447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.24473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.24826: variable 'network_connections' from source: task vars 10215 1727204065.24830: variable 'port2_profile' from source: play vars 10215 1727204065.24833: variable 'port2_profile' from source: play vars 10215 1727204065.24847: variable 'port1_profile' from source: play vars 10215 1727204065.25296: variable 'port1_profile' from source: play vars 10215 1727204065.25300: variable 'controller_profile' from source: play vars 10215 1727204065.25376: variable 'controller_profile' from source: play vars 10215 1727204065.25481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204065.25547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204065.25593: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.25645: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204065.25756: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204065.26521: variable 'network_connections' from source: task vars 10215 1727204065.26534: variable 'port2_profile' from source: play vars 10215 1727204065.26669: variable 'port2_profile' from source: play vars 10215 
1727204065.26688: variable 'port1_profile' from source: play vars 10215 1727204065.26819: variable 'port1_profile' from source: play vars 10215 1727204065.26843: variable 'controller_profile' from source: play vars 10215 1727204065.26970: variable 'controller_profile' from source: play vars 10215 1727204065.27021: variable '__network_packages_default_wireless' from source: role '' defaults 10215 1727204065.27135: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204065.27597: variable 'network_connections' from source: task vars 10215 1727204065.27601: variable 'port2_profile' from source: play vars 10215 1727204065.27673: variable 'port2_profile' from source: play vars 10215 1727204065.27688: variable 'port1_profile' from source: play vars 10215 1727204065.27894: variable 'port1_profile' from source: play vars 10215 1727204065.27898: variable 'controller_profile' from source: play vars 10215 1727204065.27900: variable 'controller_profile' from source: play vars 10215 1727204065.27914: variable '__network_packages_default_team' from source: role '' defaults 10215 1727204065.28027: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204065.28493: variable 'network_connections' from source: task vars 10215 1727204065.28506: variable 'port2_profile' from source: play vars 10215 1727204065.28598: variable 'port2_profile' from source: play vars 10215 1727204065.28617: variable 'port1_profile' from source: play vars 10215 1727204065.28705: variable 'port1_profile' from source: play vars 10215 1727204065.28724: variable 'controller_profile' from source: play vars 10215 1727204065.28816: variable 'controller_profile' from source: play vars 10215 1727204065.28896: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204065.28977: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204065.28996: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204065.29082: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204065.29409: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10215 1727204065.30019: variable 'network_connections' from source: task vars 10215 1727204065.30023: variable 'port2_profile' from source: play vars 10215 1727204065.30072: variable 'port2_profile' from source: play vars 10215 1727204065.30080: variable 'port1_profile' from source: play vars 10215 1727204065.30132: variable 'port1_profile' from source: play vars 10215 1727204065.30140: variable 'controller_profile' from source: play vars 10215 1727204065.30188: variable 'controller_profile' from source: play vars 10215 1727204065.30200: variable 'ansible_distribution' from source: facts 10215 1727204065.30215: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.30219: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.30248: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10215 1727204065.30379: variable 'ansible_distribution' from source: facts 10215 1727204065.30382: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.30388: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.30398: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10215 1727204065.30538: 
variable 'ansible_distribution' from source: facts 10215 1727204065.30541: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.30548: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.30577: variable 'network_provider' from source: set_fact 10215 1727204065.30599: variable 'ansible_facts' from source: unknown 10215 1727204065.31397: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 10215 1727204065.31400: when evaluation is False, skipping this task 10215 1727204065.31403: _execute() done 10215 1727204065.31405: dumping result to json 10215 1727204065.31411: done dumping result, returning 10215 1727204065.31414: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-3c74-8f8e-000000000084] 10215 1727204065.31594: sending task result for task 12b410aa-8751-3c74-8f8e-000000000084 10215 1727204065.31670: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000084 10215 1727204065.31673: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 10215 1727204065.31725: no more pending results, returning what we have 10215 1727204065.31728: results queue empty 10215 1727204065.31729: checking for any_errors_fatal 10215 1727204065.31736: done checking for any_errors_fatal 10215 1727204065.31737: checking for max_fail_percentage 10215 1727204065.31739: done checking for max_fail_percentage 10215 1727204065.31740: checking to see if all hosts have failed and the running result is not ok 10215 1727204065.31741: done checking to see if all hosts have failed 10215 1727204065.31742: getting the remaining hosts for this loop 10215 1727204065.31743: done getting the remaining hosts for this loop 10215 1727204065.31752: getting the next task for host managed-node3 10215 1727204065.31758: done getting next task for host managed-node3 10215 1727204065.31762: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10215 1727204065.31766: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204065.31785: getting variables 10215 1727204065.31787: in VariableManager get_vars() 10215 1727204065.31838: Calling all_inventory to load vars for managed-node3 10215 1727204065.31841: Calling groups_inventory to load vars for managed-node3 10215 1727204065.31843: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204065.31854: Calling all_plugins_play to load vars for managed-node3 10215 1727204065.31857: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204065.31861: Calling groups_plugins_play to load vars for managed-node3 10215 1727204065.33348: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204065.34995: done with get_vars() 10215 1727204065.35019: done getting variables 10215 1727204065.35072: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.219) 0:00:33.918 ***** 10215 1727204065.35102: entering _queue_task() for managed-node3/package 10215 1727204065.35363: worker is 1 (out of 1 available) 10215 1727204065.35381: exiting _queue_task() for managed-node3/package 10215 1727204065.35396: done queuing things up, now waiting for results queue to drain 10215 1727204065.35399: waiting for pending results... 
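Note on the skip above: the role's "Install packages" task is guarded by the expression reported as the false_condition, not network_packages is subset(ansible_facts.packages.keys()). Every package in network_packages was already present in the package facts gathered earlier in the run, so the guard evaluated False and the install was skipped. The role source is not included in this log, so the following is only a hedged sketch of what such a guarded install task typically looks like; the task name and the when: expression are taken verbatim from the output, while the module arguments are assumptions.

```yaml
# Hedged sketch, not copied from the role. Only the task name and the
# when: expression appear in the log; the module arguments are assumed.
- name: Install packages
  ansible.builtin.package:
    name: "{{ network_packages }}"   # package list resolved by the role for the nm provider
    state: present
  # Run only when at least one required package is missing from the
  # package facts collected earlier in the play.
  when: not network_packages is subset(ansible_facts.packages.keys())
```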
10215 1727204065.35587: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 10215 1727204065.35719: in run() - task 12b410aa-8751-3c74-8f8e-000000000085 10215 1727204065.35735: variable 'ansible_search_path' from source: unknown 10215 1727204065.35741: variable 'ansible_search_path' from source: unknown 10215 1727204065.35771: calling self._execute() 10215 1727204065.35854: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.35859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.35871: variable 'omit' from source: magic vars 10215 1727204065.36201: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.36213: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204065.36314: variable 'network_state' from source: role '' defaults 10215 1727204065.36325: Evaluated conditional (network_state != {}): False 10215 1727204065.36328: when evaluation is False, skipping this task 10215 1727204065.36331: _execute() done 10215 1727204065.36335: dumping result to json 10215 1727204065.36340: done dumping result, returning 10215 1727204065.36349: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-3c74-8f8e-000000000085] 10215 1727204065.36355: sending task result for task 12b410aa-8751-3c74-8f8e-000000000085 10215 1727204065.36454: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000085 10215 1727204065.36456: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204065.36510: no more pending results, returning what we have 10215 1727204065.36514: results queue empty 10215 1727204065.36515: checking for any_errors_fatal 10215 1727204065.36521: done checking for any_errors_fatal 10215 1727204065.36522: checking for max_fail_percentage 10215 1727204065.36523: done checking for max_fail_percentage 10215 1727204065.36525: checking to see if all hosts have failed and the running result is not ok 10215 1727204065.36526: done checking to see if all hosts have failed 10215 1727204065.36526: getting the remaining hosts for this loop 10215 1727204065.36528: done getting the remaining hosts for this loop 10215 1727204065.36532: getting the next task for host managed-node3 10215 1727204065.36540: done getting next task for host managed-node3 10215 1727204065.36544: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10215 1727204065.36548: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204065.36567: getting variables 10215 1727204065.36569: in VariableManager get_vars() 10215 1727204065.36609: Calling all_inventory to load vars for managed-node3 10215 1727204065.36612: Calling groups_inventory to load vars for managed-node3 10215 1727204065.36615: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204065.36625: Calling all_plugins_play to load vars for managed-node3 10215 1727204065.36628: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204065.36631: Calling groups_plugins_play to load vars for managed-node3 10215 1727204065.37782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204065.39491: done with get_vars() 10215 1727204065.39523: done getting variables 10215 1727204065.39587: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.045) 0:00:33.963 ***** 10215 1727204065.39627: entering _queue_task() for managed-node3/package 10215 1727204065.39926: worker is 1 (out of 1 available) 10215 1727204065.39940: exiting _queue_task() for managed-node3/package 10215 1727204065.39954: done queuing things up, now waiting for results queue to drain 10215 1727204065.39955: waiting for pending results... 
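Both the task just skipped and the python3-libnmstate install queued next are gated on network_state, which defaults to an empty dict in the role; this play never sets it, so network_state != {} is False in both cases. Purely as a hedged illustration (the variable name and conditional come from the log, the host pattern matches this run, and the state values are invented), enabling that code path would look roughly like this in the calling play:

```yaml
# Illustrative only: a non-empty network_state makes the
# "when: network_state != {}" install tasks run.
- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_state:              # any non-empty nmstate-style state dict
          interfaces:
            - name: eth1            # hypothetical interface, not from this log
              type: ethernet
              state: up
```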
10215 1727204065.40410: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 10215 1727204065.40432: in run() - task 12b410aa-8751-3c74-8f8e-000000000086 10215 1727204065.40452: variable 'ansible_search_path' from source: unknown 10215 1727204065.40460: variable 'ansible_search_path' from source: unknown 10215 1727204065.40508: calling self._execute() 10215 1727204065.40612: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.40628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.40643: variable 'omit' from source: magic vars 10215 1727204065.41064: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.41081: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204065.41214: variable 'network_state' from source: role '' defaults 10215 1727204065.41223: Evaluated conditional (network_state != {}): False 10215 1727204065.41226: when evaluation is False, skipping this task 10215 1727204065.41230: _execute() done 10215 1727204065.41234: dumping result to json 10215 1727204065.41238: done dumping result, returning 10215 1727204065.41252: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-3c74-8f8e-000000000086] 10215 1727204065.41256: sending task result for task 12b410aa-8751-3c74-8f8e-000000000086 10215 1727204065.41358: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000086 10215 1727204065.41362: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204065.41426: no more pending results, returning what we have 10215 1727204065.41430: results queue empty 10215 1727204065.41431: checking for any_errors_fatal 10215 1727204065.41439: done checking for any_errors_fatal 10215 1727204065.41439: checking for max_fail_percentage 10215 1727204065.41441: done checking for max_fail_percentage 10215 1727204065.41442: checking to see if all hosts have failed and the running result is not ok 10215 1727204065.41443: done checking to see if all hosts have failed 10215 1727204065.41444: getting the remaining hosts for this loop 10215 1727204065.41446: done getting the remaining hosts for this loop 10215 1727204065.41450: getting the next task for host managed-node3 10215 1727204065.41457: done getting next task for host managed-node3 10215 1727204065.41461: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10215 1727204065.41470: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204065.41490: getting variables 10215 1727204065.41492: in VariableManager get_vars() 10215 1727204065.41533: Calling all_inventory to load vars for managed-node3 10215 1727204065.41537: Calling groups_inventory to load vars for managed-node3 10215 1727204065.41540: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204065.41551: Calling all_plugins_play to load vars for managed-node3 10215 1727204065.41554: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204065.41557: Calling groups_plugins_play to load vars for managed-node3 10215 1727204065.42857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204065.44402: done with get_vars() 10215 1727204065.44425: done getting variables 10215 1727204065.44500: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.049) 0:00:34.012 ***** 10215 1727204065.44532: entering _queue_task() for managed-node3/service 10215 1727204065.44781: worker is 1 (out of 1 available) 10215 1727204065.44799: exiting _queue_task() for managed-node3/service 10215 1727204065.44815: done queuing things up, now waiting for results queue to drain 10215 1727204065.44818: waiting for pending results... 
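The restart task queued above only fires when wireless or team connections are defined; with the controller and port profiles used in this play (controller_profile, port1_profile, port2_profile) neither __network_wireless_connections_defined nor __network_team_connections_defined is true, so it is skipped a few lines below. A hedged sketch of such a conditional restart, using the task name and when: expression from the log (the service name is an assumption):

```yaml
# Hedged sketch; only the task name and the when: expression are from the log.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager      # assumed; the role may use a variable here
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```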
10215 1727204065.45005: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 10215 1727204065.45120: in run() - task 12b410aa-8751-3c74-8f8e-000000000087 10215 1727204065.45132: variable 'ansible_search_path' from source: unknown 10215 1727204065.45137: variable 'ansible_search_path' from source: unknown 10215 1727204065.45173: calling self._execute() 10215 1727204065.45256: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.45260: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.45274: variable 'omit' from source: magic vars 10215 1727204065.45594: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.45606: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204065.45709: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204065.45873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204065.48017: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204065.48077: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204065.48108: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204065.48140: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204065.48168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204065.48236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.48262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.48285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.48322: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.48335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.48378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.48400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.48424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 10215 1727204065.48455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.48467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.48507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.48528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.48548: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.48577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.48591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.48737: variable 'network_connections' from source: task vars 10215 1727204065.48749: variable 'port2_profile' from source: play vars 10215 1727204065.48802: variable 'port2_profile' from source: play vars 10215 1727204065.48815: variable 'port1_profile' from source: play vars 10215 1727204065.48865: variable 'port1_profile' from source: play vars 10215 1727204065.48875: variable 'controller_profile' from source: play vars 10215 1727204065.48929: variable 'controller_profile' from source: play vars 10215 1727204065.48988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204065.49132: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204065.49168: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204065.49196: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204065.49224: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204065.49263: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204065.49284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204065.49311: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.49331: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204065.49380: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204065.49675: variable 'network_connections' from source: task vars 10215 1727204065.49678: variable 'port2_profile' from source: play vars 10215 1727204065.49715: variable 'port2_profile' from source: play vars 10215 1727204065.49734: variable 'port1_profile' from source: play vars 10215 1727204065.49910: variable 'port1_profile' from source: play vars 10215 1727204065.49926: variable 'controller_profile' from source: play vars 10215 1727204065.50006: variable 'controller_profile' from source: play vars 10215 1727204065.50041: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 10215 1727204065.50060: when evaluation is False, skipping this task 10215 1727204065.50068: _execute() done 10215 1727204065.50076: dumping result to json 10215 1727204065.50195: done dumping result, returning 10215 1727204065.50198: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-3c74-8f8e-000000000087] 10215 1727204065.50201: sending task result for task 12b410aa-8751-3c74-8f8e-000000000087 10215 1727204065.50268: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000087 10215 1727204065.50270: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 10215 1727204065.50380: no more pending results, returning what we have 10215 1727204065.50384: results queue empty 10215 1727204065.50385: checking for any_errors_fatal 10215 1727204065.50393: done checking for any_errors_fatal 10215 1727204065.50394: checking for max_fail_percentage 10215 1727204065.50396: done checking for max_fail_percentage 10215 1727204065.50397: checking to see if all hosts have failed and the running result is not ok 10215 1727204065.50398: done checking to see if all hosts have failed 10215 1727204065.50399: getting the remaining hosts for this loop 10215 1727204065.50401: done getting the remaining hosts for this loop 10215 1727204065.50405: getting the next task for host managed-node3 10215 1727204065.50416: done getting next task for host managed-node3 10215 1727204065.50419: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10215 1727204065.50423: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204065.50441: getting variables 10215 1727204065.50443: in VariableManager get_vars() 10215 1727204065.50483: Calling all_inventory to load vars for managed-node3 10215 1727204065.50486: Calling groups_inventory to load vars for managed-node3 10215 1727204065.50488: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204065.50646: Calling all_plugins_play to load vars for managed-node3 10215 1727204065.50650: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204065.50654: Calling groups_plugins_play to load vars for managed-node3 10215 1727204065.52765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204065.55902: done with get_vars() 10215 1727204065.55936: done getting variables 10215 1727204065.56013: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:25 -0400 (0:00:00.115) 0:00:34.127 ***** 10215 1727204065.56053: entering _queue_task() for managed-node3/service 10215 1727204065.56421: worker is 1 (out of 1 available) 10215 1727204065.56440: exiting _queue_task() for managed-node3/service 10215 1727204065.56455: done queuing things up, now waiting for results queue to drain 10215 1727204065.56457: waiting for pending results... 10215 1727204065.56806: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 10215 1727204065.56998: in run() - task 12b410aa-8751-3c74-8f8e-000000000088 10215 1727204065.57095: variable 'ansible_search_path' from source: unknown 10215 1727204065.57099: variable 'ansible_search_path' from source: unknown 10215 1727204065.57102: calling self._execute() 10215 1727204065.57208: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.57223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.57240: variable 'omit' from source: magic vars 10215 1727204065.57703: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.57723: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204065.57946: variable 'network_provider' from source: set_fact 10215 1727204065.57958: variable 'network_state' from source: role '' defaults 10215 1727204065.57975: Evaluated conditional (network_provider == "nm" or network_state != {}): True 10215 1727204065.57987: variable 'omit' from source: magic vars 10215 1727204065.58086: variable 'omit' from source: magic vars 10215 1727204065.58242: variable 'network_service_name' from source: role '' defaults 10215 1727204065.58246: variable 'network_service_name' from source: role '' defaults 10215 1727204065.58371: variable '__network_provider_setup' from source: role '' defaults 10215 1727204065.58383: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204065.58460: variable '__network_service_name_default_nm' from source: role '' defaults 10215 1727204065.58473: variable '__network_packages_default_nm' from source: role '' defaults 
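Unlike the preceding installs, this task is not skipped: network_provider was set to "nm" earlier via set_fact, so network_provider == "nm" or network_state != {} evaluates True, and the role goes on to resolve network_service_name and dispatch the service module over SSH (the connection setup and AnsiballZ transfer that follow). Sketching the shape of such an enable-and-start task, with the conditional and variable name taken from the log and the module arguments hedged:

```yaml
# Hedged sketch; the when: expression and network_service_name come from the
# log, the remaining arguments are assumptions.
- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"   # the NetworkManager unit for the nm provider
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
```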
10215 1727204065.58563: variable '__network_packages_default_nm' from source: role '' defaults 10215 1727204065.58872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204065.62275: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204065.62381: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204065.62442: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204065.62522: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204065.62565: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204065.62727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.62776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.62818: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.62882: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.62978: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.62981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.63013: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.63047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.63108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.63132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.63586: variable '__network_packages_default_gobject_packages' from source: role '' defaults 10215 1727204065.63859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.63897: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.63933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.63994: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.64050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.64310: variable 'ansible_python' from source: facts 10215 1727204065.64314: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 10215 1727204065.64419: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204065.64531: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204065.64703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.64744: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.64862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.64920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.65018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.65093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204065.65155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204065.65255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.65260: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204065.65272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204065.65485: variable 'network_connections' from 
source: task vars 10215 1727204065.65488: variable 'port2_profile' from source: play vars 10215 1727204065.65577: variable 'port2_profile' from source: play vars 10215 1727204065.65696: variable 'port1_profile' from source: play vars 10215 1727204065.65700: variable 'port1_profile' from source: play vars 10215 1727204065.65715: variable 'controller_profile' from source: play vars 10215 1727204065.65807: variable 'controller_profile' from source: play vars 10215 1727204065.65941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204065.66198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204065.66265: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204065.66320: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204065.66469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204065.66473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204065.66496: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204065.66542: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204065.66595: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204065.66655: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204065.67037: variable 'network_connections' from source: task vars 10215 1727204065.67051: variable 'port2_profile' from source: play vars 10215 1727204065.67144: variable 'port2_profile' from source: play vars 10215 1727204065.67168: variable 'port1_profile' from source: play vars 10215 1727204065.67262: variable 'port1_profile' from source: play vars 10215 1727204065.67280: variable 'controller_profile' from source: play vars 10215 1727204065.67363: variable 'controller_profile' from source: play vars 10215 1727204065.67412: variable '__network_packages_default_wireless' from source: role '' defaults 10215 1727204065.67505: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204065.68029: variable 'network_connections' from source: task vars 10215 1727204065.68035: variable 'port2_profile' from source: play vars 10215 1727204065.68195: variable 'port2_profile' from source: play vars 10215 1727204065.68198: variable 'port1_profile' from source: play vars 10215 1727204065.68254: variable 'port1_profile' from source: play vars 10215 1727204065.68270: variable 'controller_profile' from source: play vars 10215 1727204065.68376: variable 'controller_profile' from source: play vars 10215 1727204065.68408: variable '__network_packages_default_team' from source: role '' defaults 10215 1727204065.68515: variable '__network_team_connections_defined' from source: role '' defaults 10215 1727204065.69097: variable 'network_connections' 
from source: task vars 10215 1727204065.69101: variable 'port2_profile' from source: play vars 10215 1727204065.69251: variable 'port2_profile' from source: play vars 10215 1727204065.69260: variable 'port1_profile' from source: play vars 10215 1727204065.69458: variable 'port1_profile' from source: play vars 10215 1727204065.69467: variable 'controller_profile' from source: play vars 10215 1727204065.69655: variable 'controller_profile' from source: play vars 10215 1727204065.69746: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204065.69854: variable '__network_service_name_default_initscripts' from source: role '' defaults 10215 1727204065.69865: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204065.69958: variable '__network_packages_default_initscripts' from source: role '' defaults 10215 1727204065.70271: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 10215 1727204065.71169: variable 'network_connections' from source: task vars 10215 1727204065.71177: variable 'port2_profile' from source: play vars 10215 1727204065.71255: variable 'port2_profile' from source: play vars 10215 1727204065.71286: variable 'port1_profile' from source: play vars 10215 1727204065.71338: variable 'port1_profile' from source: play vars 10215 1727204065.71347: variable 'controller_profile' from source: play vars 10215 1727204065.71423: variable 'controller_profile' from source: play vars 10215 1727204065.71433: variable 'ansible_distribution' from source: facts 10215 1727204065.71437: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.71445: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.71459: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 10215 1727204065.71606: variable 'ansible_distribution' from source: facts 10215 1727204065.71611: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.71619: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.71628: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 10215 1727204065.71767: variable 'ansible_distribution' from source: facts 10215 1727204065.71771: variable '__network_rh_distros' from source: role '' defaults 10215 1727204065.71777: variable 'ansible_distribution_major_version' from source: facts 10215 1727204065.71809: variable 'network_provider' from source: set_fact 10215 1727204065.71834: variable 'omit' from source: magic vars 10215 1727204065.71859: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204065.71883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204065.71902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204065.71921: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204065.71930: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204065.71960: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204065.71964: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.71969: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.72054: Set connection var ansible_connection to ssh 10215 1727204065.72063: Set connection var ansible_pipelining to False 10215 1727204065.72069: Set connection var ansible_shell_type to sh 10215 1727204065.72076: Set connection var ansible_timeout to 10 10215 1727204065.72082: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204065.72092: Set connection var ansible_shell_executable to /bin/sh 10215 1727204065.72116: variable 'ansible_shell_executable' from source: unknown 10215 1727204065.72119: variable 'ansible_connection' from source: unknown 10215 1727204065.72122: variable 'ansible_module_compression' from source: unknown 10215 1727204065.72126: variable 'ansible_shell_type' from source: unknown 10215 1727204065.72129: variable 'ansible_shell_executable' from source: unknown 10215 1727204065.72134: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204065.72138: variable 'ansible_pipelining' from source: unknown 10215 1727204065.72142: variable 'ansible_timeout' from source: unknown 10215 1727204065.72147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204065.72238: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204065.72248: variable 'omit' from source: magic vars 10215 1727204065.72253: starting attempt loop 10215 1727204065.72256: running the handler 10215 1727204065.72327: variable 'ansible_facts' from source: unknown 10215 1727204065.73452: _low_level_execute_command(): starting 10215 1727204065.73456: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204065.73982: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204065.73998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204065.74002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.74031: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204065.74034: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204065.74037: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.74096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204065.74110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204065.74150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 
1727204065.75903: stdout chunk (state=3): >>>/root <<< 10215 1727204065.76028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204065.76080: stderr chunk (state=3): >>><<< 10215 1727204065.76084: stdout chunk (state=3): >>><<< 10215 1727204065.76179: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204065.76183: _low_level_execute_command(): starting 10215 1727204065.76187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894 `" && echo ansible-tmp-1727204065.7611637-11990-274223473968894="` echo /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894 `" ) && sleep 0' 10215 1727204065.76796: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204065.76799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204065.76846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204065.76849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204065.76852: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204065.76855: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204065.76861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.76953: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204065.76957: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204065.76960: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204065.76962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204065.76993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK <<< 10215 1727204065.77009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204065.77076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204065.79036: stdout chunk (state=3): >>>ansible-tmp-1727204065.7611637-11990-274223473968894=/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894 <<< 10215 1727204065.79167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204065.79252: stderr chunk (state=3): >>><<< 10215 1727204065.79257: stdout chunk (state=3): >>><<< 10215 1727204065.79278: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204065.7611637-11990-274223473968894=/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204065.79494: variable 'ansible_module_compression' from source: unknown 10215 1727204065.79498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 10215 1727204065.79500: variable 'ansible_facts' from source: unknown 10215 1727204065.79667: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py 10215 1727204065.79918: Sending initial data 10215 1727204065.79922: Sent initial data (156 bytes) 10215 1727204065.80415: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204065.80426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204065.80505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.80540: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204065.80552: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204065.80565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204065.80634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204065.82256: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 10215 1727204065.82261: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204065.82293: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204065.82328: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpc5qgy_6_ /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py <<< 10215 1727204065.82344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py" <<< 10215 1727204065.82364: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpc5qgy_6_" to remote "/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py" <<< 10215 1727204065.82371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py" <<< 10215 1727204065.84310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204065.84396: stderr chunk (state=3): >>><<< 10215 1727204065.84400: stdout chunk (state=3): >>><<< 10215 1727204065.84402: done transferring module to remote 10215 1727204065.84405: _low_level_execute_command(): starting 10215 1727204065.84407: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/ /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py && sleep 0' 10215 1727204065.84834: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204065.84838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.84841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204065.84843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.84899: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204065.84904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204065.84942: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204065.86785: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204065.86832: stderr chunk (state=3): >>><<< 10215 1727204065.86836: stdout chunk (state=3): >>><<< 10215 1727204065.86849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204065.86852: _low_level_execute_command(): starting 10215 1727204065.86858: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/AnsiballZ_systemd.py && sleep 0' 10215 1727204065.87259: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204065.87295: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204065.87298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204065.87304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.87306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204065.87309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
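The exchange above is the per-task module delivery cycle that Ansible repeats for every module-backed task against managed-node3: create a private temporary directory on the remote side with a restrictive umask, push the cached AnsiballZ payload (here AnsiballZ_systemd.py) over the already-multiplexed SSH connection via SFTP, mark it executable, run it with the remote interpreter, and remove the directory afterwards (the cleanup shows up further down in the log). Below is a minimal, hand-rolled Python sketch of those steps using the plain ssh and sftp clients; the host alias, the local payload file name, and the helper function are placeholders for illustration only, not Ansible's actual implementation.

    import os
    import random
    import subprocess
    import time

    HOST = "managed-node3"              # placeholder inventory alias for this sketch
    PAYLOAD = "AnsiballZ_systemd.py"    # locally built module payload, assumed to exist

    def ssh(cmd: str) -> subprocess.CompletedProcess:
        # Run one /bin/sh command on the target, as _low_level_execute_command() does.
        return subprocess.run(["ssh", HOST, f"/bin/sh -c '{cmd}'"],
                              capture_output=True, text=True, check=True)

    # 1. Unique per-task temp dir: timestamp + pid + random, mirroring the log's naming.
    tmp = f"$HOME/.ansible/tmp/ansible-tmp-{time.time()}-{os.getpid()}-{random.randrange(10**15)}"
    remote_tmp = ssh(f'umask 77 && mkdir -p "$HOME/.ansible/tmp" && mkdir "{tmp}" '
                     f'&& echo "{tmp}"').stdout.strip()

    # 2. Transfer the payload over SFTP (Ansible reuses the multiplexed connection).
    subprocess.run(["sftp", "-b", "-", HOST],
                   input=f"put {PAYLOAD} {remote_tmp}/{PAYLOAD}\n", text=True, check=True)

    # 3. Make it executable, run it with the remote interpreter, then clean up.
    ssh(f"chmod u+x {remote_tmp}/ {remote_tmp}/{PAYLOAD}")
    result = ssh(f"/usr/bin/python3.12 {remote_tmp}/{PAYLOAD}")  # interpreter path as seen in this run
    ssh(f"rm -rf {remote_tmp}")
    print(result.stdout)                # JSON document describing the module result

Echoing the freshly created path back to the controller is what lets the later SFTP put and python3.12 invocation use a concrete path instead of guessing how $HOME expands on the target.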
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204065.87361: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204065.87368: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204065.87407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204066.20512: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11853824", "MemoryAvailable": "infinity", "CPUUsageNSec": "740585000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 10215 1727204066.20522: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", 
"ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, 
"masked": null}}} <<< 10215 1727204066.22527: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204066.22531: stdout chunk (state=3): >>><<< 10215 1727204066.22534: stderr chunk (state=3): >>><<< 10215 1727204066.22552: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "647", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ExecMainStartTimestampMonotonic": "28911103", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "647", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11853824", "MemoryAvailable": "infinity", "CPUUsageNSec": "740585000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": 
"infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.target NetworkManager-wait-online.service cloud-init.service network.service shutdown.target multi-user.target", "After": "systemd-journald.socket dbus-broker.service system.slice cloud-init-local.service basic.target dbus.socket sysinit.target network-pre.target", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:14 EDT", "StateChangeTimestampMonotonic": "499035810", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:24 EDT", "InactiveExitTimestampMonotonic": "28911342", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:25 EDT", "ActiveEnterTimestampMonotonic": "29816317", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:24 EDT", "ConditionTimestampMonotonic": "28901880", "AssertTimestamp": "Tue 2024-09-24 14:45:24 EDT", "AssertTimestampMonotonic": "28901883", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "b6a383b318af414f897f5e2227729b18", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204066.23230: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204066.23376: _low_level_execute_command(): starting 10215 1727204066.23640: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204065.7611637-11990-274223473968894/ > /dev/null 2>&1 && sleep 0' 10215 1727204066.24900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204066.24916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204066.24931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204066.25102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204066.25343: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204066.25412: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204066.27467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204066.27471: stdout chunk (state=3): >>><<< 10215 1727204066.27474: stderr chunk (state=3): >>><<< 10215 1727204066.27761: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204066.27765: handler run complete 10215 1727204066.27881: attempt loop complete, returning result 10215 1727204066.27894: _execute() done 10215 1727204066.28295: dumping result to json 10215 1727204066.28299: done dumping result, returning 10215 1727204066.28303: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-3c74-8f8e-000000000088] 10215 1727204066.28306: sending task result for task 12b410aa-8751-3c74-8f8e-000000000088 ok: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204066.28927: no more pending results, returning what we have 10215 1727204066.28932: results queue empty 10215 1727204066.28934: checking for any_errors_fatal 10215 1727204066.28941: done checking for any_errors_fatal 10215 1727204066.28942: checking for max_fail_percentage 10215 1727204066.28945: done checking for max_fail_percentage 10215 1727204066.28946: checking to see if all hosts have failed and the running result is not ok 10215 1727204066.28947: done checking to see if all hosts have failed 10215 1727204066.28948: getting the remaining hosts for this loop 10215 1727204066.28950: done getting the remaining hosts for this loop 10215 1727204066.28955: getting the next task for host managed-node3 10215 1727204066.28964: done getting next task for host managed-node3 10215 1727204066.28969: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10215 1727204066.28974: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204066.28991: getting variables 10215 1727204066.28994: in VariableManager get_vars() 10215 1727204066.29041: Calling all_inventory to load vars for managed-node3 10215 1727204066.29044: Calling groups_inventory to load vars for managed-node3 10215 1727204066.29048: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204066.29061: Calling all_plugins_play to load vars for managed-node3 10215 1727204066.29066: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204066.29070: Calling groups_plugins_play to load vars for managed-node3 10215 1727204066.30409: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000088 10215 1727204066.30414: WORKER PROCESS EXITING 10215 1727204066.37218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204066.43498: done with get_vars() 10215 1727204066.43545: done getting variables 10215 1727204066.43620: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.876) 0:00:35.003 ***** 10215 1727204066.43661: entering _queue_task() for managed-node3/service 10215 1727204066.44448: worker is 1 (out of 1 available) 10215 1727204066.44464: exiting _queue_task() for managed-node3/service 10215 1727204066.44478: done queuing things up, now waiting for results queue to drain 10215 1727204066.44480: waiting for pending results... 
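The large JSON document echoed back in the previous exchange is the systemd module's result for NetworkManager: the unit is already active, running, and enabled, so the task reports ok with changed=false (and the callback hides the body because of no_log). A short sketch of extracting the fields that drive that verdict, assuming the result has been captured as the string raw below; the excerpt is trimmed from the output shown above, and the duplicated or truncated chunks in the log are only an artifact of how stdout is streamed back in pieces.

    import json

    # trimmed excerpt of the systemd module output shown above
    raw = '''{"name": "NetworkManager", "changed": false,
              "status": {"ActiveState": "active", "SubState": "running",
                         "UnitFileState": "enabled", "MainPID": "647"},
              "enabled": true, "state": "started"}'''

    result = json.loads(raw)
    status = result["status"]
    print(result["name"], "is",
          f"{status['ActiveState']}/{status['SubState']},",
          f"unit file {status['UnitFileState']},",
          "changed" if result["changed"] else "no change needed")
    # NetworkManager is active/running, unit file enabled, no change needed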
10215 1727204066.45006: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 10215 1727204066.45292: in run() - task 12b410aa-8751-3c74-8f8e-000000000089 10215 1727204066.45416: variable 'ansible_search_path' from source: unknown 10215 1727204066.45426: variable 'ansible_search_path' from source: unknown 10215 1727204066.45473: calling self._execute() 10215 1727204066.45896: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204066.45900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204066.45904: variable 'omit' from source: magic vars 10215 1727204066.46666: variable 'ansible_distribution_major_version' from source: facts 10215 1727204066.46712: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204066.47042: variable 'network_provider' from source: set_fact 10215 1727204066.47054: Evaluated conditional (network_provider == "nm"): True 10215 1727204066.47173: variable '__network_wpa_supplicant_required' from source: role '' defaults 10215 1727204066.47505: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 10215 1727204066.48098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204066.50793: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204066.51052: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204066.51101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204066.51364: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204066.51367: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204066.51371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204066.51373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204066.51529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204066.51588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204066.51895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204066.51899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204066.51915: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 10215 1727204066.51955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204066.52012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204066.52295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204066.52298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204066.52304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204066.52341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204066.52399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204066.52694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204066.52801: variable 'network_connections' from source: task vars 10215 1727204066.53094: variable 'port2_profile' from source: play vars 10215 1727204066.53098: variable 'port2_profile' from source: play vars 10215 1727204066.53116: variable 'port1_profile' from source: play vars 10215 1727204066.53188: variable 'port1_profile' from source: play vars 10215 1727204066.53494: variable 'controller_profile' from source: play vars 10215 1727204066.53498: variable 'controller_profile' from source: play vars 10215 1727204066.53592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 10215 1727204066.54001: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 10215 1727204066.54240: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 10215 1727204066.54282: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 10215 1727204066.54325: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 10215 1727204066.54384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 10215 1727204066.54624: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 10215 1727204066.54660: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204066.54702: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 10215 1727204066.54766: variable '__network_wireless_connections_defined' from source: role '' defaults 10215 1727204066.55506: variable 'network_connections' from source: task vars 10215 1727204066.55519: variable 'port2_profile' from source: play vars 10215 1727204066.55597: variable 'port2_profile' from source: play vars 10215 1727204066.55895: variable 'port1_profile' from source: play vars 10215 1727204066.55898: variable 'port1_profile' from source: play vars 10215 1727204066.55910: variable 'controller_profile' from source: play vars 10215 1727204066.55980: variable 'controller_profile' from source: play vars 10215 1727204066.56024: Evaluated conditional (__network_wpa_supplicant_required): False 10215 1727204066.56294: when evaluation is False, skipping this task 10215 1727204066.56298: _execute() done 10215 1727204066.56300: dumping result to json 10215 1727204066.56302: done dumping result, returning 10215 1727204066.56305: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-3c74-8f8e-000000000089] 10215 1727204066.56307: sending task result for task 12b410aa-8751-3c74-8f8e-000000000089 10215 1727204066.56383: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000089 10215 1727204066.56388: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 10215 1727204066.56444: no more pending results, returning what we have 10215 1727204066.56448: results queue empty 10215 1727204066.56450: checking for any_errors_fatal 10215 1727204066.56472: done checking for any_errors_fatal 10215 1727204066.56473: checking for max_fail_percentage 10215 1727204066.56475: done checking for max_fail_percentage 10215 1727204066.56476: checking to see if all hosts have failed and the running result is not ok 10215 1727204066.56477: done checking to see if all hosts have failed 10215 1727204066.56478: getting the remaining hosts for this loop 10215 1727204066.56480: done getting the remaining hosts for this loop 10215 1727204066.56484: getting the next task for host managed-node3 10215 1727204066.56494: done getting next task for host managed-node3 10215 1727204066.56498: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 10215 1727204066.56503: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 10215 1727204066.56526: getting variables 10215 1727204066.56528: in VariableManager get_vars() 10215 1727204066.56575: Calling all_inventory to load vars for managed-node3 10215 1727204066.56578: Calling groups_inventory to load vars for managed-node3 10215 1727204066.56581: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204066.56796: Calling all_plugins_play to load vars for managed-node3 10215 1727204066.56801: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204066.56806: Calling groups_plugins_play to load vars for managed-node3 10215 1727204066.61310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204066.67368: done with get_vars() 10215 1727204066.67619: done getting variables 10215 1727204066.67896: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.242) 0:00:35.246 ***** 10215 1727204066.67940: entering _queue_task() for managed-node3/service 10215 1727204066.68722: worker is 1 (out of 1 available) 10215 1727204066.68736: exiting _queue_task() for managed-node3/service 10215 1727204066.68749: done queuing things up, now waiting for results queue to drain 10215 1727204066.68750: waiting for pending results... 
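Every conditional logged above (ansible_distribution_major_version != '6', network_provider == "nm", __network_wpa_supplicant_required) is evaluated as a Jinja2 expression against the host's variables before any module runs; the first False result routes the task into the skipping: path instead of _execute_module(). A minimal sketch of that evaluation with the jinja2 library; the hostvars dictionary is hand-built for illustration, the evaluate helper is not Ansible's internal API, and the distribution version value is assumed since the log only records the comparison results.

    from jinja2 import Environment

    env = Environment()
    hostvars = {
        # values as reported (or implied) by this run
        "ansible_distribution_major_version": "40",   # assumed; the log only shows that != '6' is True
        "network_provider": "nm",
        "__network_wpa_supplicant_required": False,
    }

    def evaluate(condition: str) -> bool:
        # Compile the bare expression and evaluate it against the host's variables.
        return bool(env.compile_expression(condition)(**hostvars))

    for cond in ("ansible_distribution_major_version != '6'",
                 'network_provider == "nm"',
                 "__network_wpa_supplicant_required"):
        print(cond, "->", evaluate(cond))
    # True, True, False: the wpa_supplicant task is skipped, matching the log.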
10215 1727204066.69106: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service 10215 1727204066.69596: in run() - task 12b410aa-8751-3c74-8f8e-00000000008a 10215 1727204066.69600: variable 'ansible_search_path' from source: unknown 10215 1727204066.69603: variable 'ansible_search_path' from source: unknown 10215 1727204066.69606: calling self._execute() 10215 1727204066.69830: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204066.69844: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204066.69863: variable 'omit' from source: magic vars 10215 1727204066.70714: variable 'ansible_distribution_major_version' from source: facts 10215 1727204066.70734: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204066.71038: variable 'network_provider' from source: set_fact 10215 1727204066.71051: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204066.71395: when evaluation is False, skipping this task 10215 1727204066.71401: _execute() done 10215 1727204066.71404: dumping result to json 10215 1727204066.71407: done dumping result, returning 10215 1727204066.71414: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-3c74-8f8e-00000000008a] 10215 1727204066.71417: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008a 10215 1727204066.71482: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008a 10215 1727204066.71484: WORKER PROCESS EXITING skipping: [managed-node3] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 10215 1727204066.71539: no more pending results, returning what we have 10215 1727204066.71544: results queue empty 10215 1727204066.71545: checking for any_errors_fatal 10215 1727204066.71553: done checking for any_errors_fatal 10215 1727204066.71554: checking for max_fail_percentage 10215 1727204066.71557: done checking for max_fail_percentage 10215 1727204066.71557: checking to see if all hosts have failed and the running result is not ok 10215 1727204066.71558: done checking to see if all hosts have failed 10215 1727204066.71559: getting the remaining hosts for this loop 10215 1727204066.71561: done getting the remaining hosts for this loop 10215 1727204066.71565: getting the next task for host managed-node3 10215 1727204066.71572: done getting next task for host managed-node3 10215 1727204066.71576: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10215 1727204066.71580: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), did rescue? False, did start at task? False 10215 1727204066.71602: getting variables 10215 1727204066.71604: in VariableManager get_vars() 10215 1727204066.71648: Calling all_inventory to load vars for managed-node3 10215 1727204066.71652: Calling groups_inventory to load vars for managed-node3 10215 1727204066.71655: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204066.71667: Calling all_plugins_play to load vars for managed-node3 10215 1727204066.71671: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204066.71675: Calling groups_plugins_play to load vars for managed-node3 10215 1727204066.76171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204066.82195: done with get_vars() 10215 1727204066.82242: done getting variables 10215 1727204066.82526: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.146) 0:00:35.392 ***** 10215 1727204066.82570: entering _queue_task() for managed-node3/copy 10215 1727204066.83377: worker is 1 (out of 1 available) 10215 1727204066.83596: exiting _queue_task() for managed-node3/copy 10215 1727204066.83613: done queuing things up, now waiting for results queue to drain 10215 1727204066.83615: waiting for pending results... 
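The 'Ensure initscripts network file dependency is present' task queued above (tasks/main.yml:150) uses the 'copy' action plugin and, as the next lines show, is skipped with false_condition network_provider == "initscripts". A hedged sketch of a copy task guarded the same way is shown below; the destination path, content and force setting are illustrative assumptions, not values taken from this log.

    # Sketch only: dest, content and force are assumed; the guard matches the skip recorded below.
    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network       # assumption: a typical initscripts dependency file
        content: "# Created by the network system role\n"
        force: false                       # assumption: create the file only if it is missing
      when: network_provider == "initscripts"
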
10215 1727204066.83946: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 10215 1727204066.84319: in run() - task 12b410aa-8751-3c74-8f8e-00000000008b 10215 1727204066.84482: variable 'ansible_search_path' from source: unknown 10215 1727204066.84487: variable 'ansible_search_path' from source: unknown 10215 1727204066.84491: calling self._execute() 10215 1727204066.84738: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204066.84779: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204066.84801: variable 'omit' from source: magic vars 10215 1727204066.85722: variable 'ansible_distribution_major_version' from source: facts 10215 1727204066.85920: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204066.86134: variable 'network_provider' from source: set_fact 10215 1727204066.86147: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204066.86156: when evaluation is False, skipping this task 10215 1727204066.86163: _execute() done 10215 1727204066.86198: dumping result to json 10215 1727204066.86435: done dumping result, returning 10215 1727204066.86440: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-3c74-8f8e-00000000008b] 10215 1727204066.86443: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008b 10215 1727204066.86525: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008b 10215 1727204066.86534: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10215 1727204066.86592: no more pending results, returning what we have 10215 1727204066.86597: results queue empty 10215 1727204066.86598: checking for any_errors_fatal 10215 1727204066.86604: done checking for any_errors_fatal 10215 1727204066.86604: checking for max_fail_percentage 10215 1727204066.86609: done checking for max_fail_percentage 10215 1727204066.86610: checking to see if all hosts have failed and the running result is not ok 10215 1727204066.86611: done checking to see if all hosts have failed 10215 1727204066.86612: getting the remaining hosts for this loop 10215 1727204066.86614: done getting the remaining hosts for this loop 10215 1727204066.86618: getting the next task for host managed-node3 10215 1727204066.86627: done getting next task for host managed-node3 10215 1727204066.86631: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10215 1727204066.86636: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204066.86659: getting variables 10215 1727204066.86662: in VariableManager get_vars() 10215 1727204066.87011: Calling all_inventory to load vars for managed-node3 10215 1727204066.87015: Calling groups_inventory to load vars for managed-node3 10215 1727204066.87018: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204066.87030: Calling all_plugins_play to load vars for managed-node3 10215 1727204066.87034: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204066.87038: Calling groups_plugins_play to load vars for managed-node3 10215 1727204066.91446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204066.97488: done with get_vars() 10215 1727204066.97539: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:26 -0400 (0:00:00.152) 0:00:35.545 ***** 10215 1727204066.97858: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10215 1727204066.98635: worker is 1 (out of 1 available) 10215 1727204066.98650: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_connections 10215 1727204066.98664: done queuing things up, now waiting for results queue to drain 10215 1727204066.98665: waiting for pending results... 10215 1727204066.99108: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 10215 1727204066.99496: in run() - task 12b410aa-8751-3c74-8f8e-00000000008c 10215 1727204066.99500: variable 'ansible_search_path' from source: unknown 10215 1727204066.99503: variable 'ansible_search_path' from source: unknown 10215 1727204066.99506: calling self._execute() 10215 1727204066.99719: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204066.99733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204066.99749: variable 'omit' from source: magic vars 10215 1727204067.00567: variable 'ansible_distribution_major_version' from source: facts 10215 1727204067.00587: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204067.00895: variable 'omit' from source: magic vars 10215 1727204067.00898: variable 'omit' from source: magic vars 10215 1727204067.01085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 10215 1727204067.06675: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 10215 1727204067.07097: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 10215 1727204067.07101: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 10215 1727204067.07103: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 10215 1727204067.07105: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 10215 1727204067.07391: variable 'network_provider' from source: set_fact 10215 1727204067.07683: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 10215 1727204067.07729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 10215 1727204067.07768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 10215 1727204067.07828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 10215 1727204067.07857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 10215 1727204067.07960: variable 'omit' from source: magic vars 10215 1727204067.08132: variable 'omit' from source: magic vars 10215 1727204067.08271: variable 'network_connections' from source: task vars 10215 1727204067.08295: variable 'port2_profile' from source: play vars 10215 1727204067.08373: variable 'port2_profile' from source: play vars 10215 1727204067.08396: variable 'port1_profile' from source: play vars 10215 1727204067.08472: variable 'port1_profile' from source: play vars 10215 1727204067.08486: variable 'controller_profile' from source: play vars 10215 1727204067.08564: variable 'controller_profile' from source: play vars 10215 1727204067.08826: variable 'omit' from source: magic vars 10215 1727204067.08842: variable '__lsr_ansible_managed' from source: task vars 10215 1727204067.08941: variable '__lsr_ansible_managed' from source: task vars 10215 1727204067.09177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 10215 1727204067.09461: Loaded config def from plugin (lookup/template) 10215 1727204067.09472: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 10215 1727204067.09514: File lookup term: get_ansible_managed.j2 10215 1727204067.09588: variable 'ansible_search_path' from source: unknown 10215 1727204067.09592: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 10215 1727204067.09599: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 10215 1727204067.09602: variable 'ansible_search_path' from source: unknown 10215 1727204067.22969: variable 'ansible_managed' from source: unknown 10215 1727204067.23209: variable 'omit' from source: magic vars 10215 1727204067.23249: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204067.23319: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204067.23322: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204067.23347: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204067.23364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204067.23405: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204067.23416: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204067.23495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204067.23565: Set connection var ansible_connection to ssh 10215 1727204067.23579: Set connection var ansible_pipelining to False 10215 1727204067.23595: Set connection var ansible_shell_type to sh 10215 1727204067.23609: Set connection var ansible_timeout to 10 10215 1727204067.23623: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204067.23644: Set connection var ansible_shell_executable to /bin/sh 10215 1727204067.23675: variable 'ansible_shell_executable' from source: unknown 10215 1727204067.23685: variable 'ansible_connection' from source: unknown 10215 1727204067.23696: variable 'ansible_module_compression' from source: unknown 10215 1727204067.23705: variable 'ansible_shell_type' from source: unknown 10215 1727204067.23714: variable 'ansible_shell_executable' from source: unknown 10215 1727204067.23722: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204067.23752: variable 'ansible_pipelining' from source: unknown 10215 1727204067.23755: variable 'ansible_timeout' from source: unknown 10215 1727204067.23767: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204067.23972: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204067.23975: variable 'omit' from source: magic vars 10215 1727204067.23978: starting attempt loop 10215 1727204067.23980: running the handler 10215 1727204067.23988: _low_level_execute_command(): starting 10215 1727204067.24004: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204067.24810: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.24877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204067.24896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204067.24929: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204067.25071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204067.26824: stdout chunk (state=3): >>>/root <<< 10215 1727204067.26870: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204067.27196: stderr chunk (state=3): >>><<< 10215 1727204067.27199: stdout chunk (state=3): >>><<< 10215 1727204067.27203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204067.27206: _low_level_execute_command(): starting 10215 1727204067.27209: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516 `" && echo ansible-tmp-1727204067.271545-12114-221886665611516="` echo /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516 `" ) && sleep 0' 10215 1727204067.28487: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204067.28608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204067.28625: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.28826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204067.28844: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204067.29063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204067.31196: stdout chunk (state=3): >>>ansible-tmp-1727204067.271545-12114-221886665611516=/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516 <<< 10215 1727204067.31295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204067.31322: stdout chunk (state=3): >>><<< 10215 1727204067.31325: stderr chunk (state=3): >>><<< 10215 1727204067.31344: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204067.271545-12114-221886665611516=/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204067.31596: variable 'ansible_module_compression' from source: unknown 10215 1727204067.31599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 10215 1727204067.31896: variable 'ansible_facts' from source: unknown 10215 1727204067.31899: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py 10215 1727204067.32207: Sending initial data 10215 1727204067.32223: Sent initial data (167 bytes) 10215 1727204067.33646: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.33704: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 10215 1727204067.33751: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.33868: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204067.33906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204067.35578: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204067.35741: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204067.35745: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5ho1cxnc /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py <<< 10215 1727204067.35751: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py" <<< 10215 1727204067.35818: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp5ho1cxnc" to remote "/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py" <<< 10215 1727204067.39379: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204067.39600: stderr chunk (state=3): >>><<< 10215 1727204067.39604: stdout chunk (state=3): >>><<< 10215 1727204067.39637: done transferring module to remote 10215 1727204067.39652: _low_level_execute_command(): starting 10215 1727204067.39656: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/ /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py && sleep 0' 10215 1727204067.41209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204067.41307: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.41423: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204067.41444: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204067.41553: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204067.43626: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204067.43630: stdout chunk (state=3): >>><<< 10215 1727204067.43633: stderr chunk (state=3): >>><<< 10215 1727204067.43636: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204067.43639: _low_level_execute_command(): starting 10215 1727204067.43641: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/AnsiballZ_network_connections.py && sleep 0' 10215 1727204067.44698: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204067.44703: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204067.44706: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204067.44709: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204067.44730: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204067.44774: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204067.44830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.06074: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/bfcd0732-4903-4a21-9526-3b74a99394ee: error=unknown <<< 10215 1727204068.07859: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail <<< 10215 1727204068.07886: stdout chunk (state=3): >>>ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0f7546ed-ab57-4407-aded-d224151d9f1f: error=unknown <<< 10215 1727204068.09712: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 10215 1727204068.09733: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/38ec32a3-a35b-422a-b696-8ced2fcaef41: error=unknown <<< 10215 1727204068.09977: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, 
{"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 10215 1727204068.12070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204068.12129: stderr chunk (state=3): >>><<< 10215 1727204068.12134: stdout chunk (state=3): >>><<< 10215 1727204068.12153: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/bfcd0732-4903-4a21-9526-3b74a99394ee: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/0f7546ed-ab57-4407-aded-d224151d9f1f: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_4538ax0x/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/38ec32a3-a35b-422a-b696-8ced2fcaef41: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", 
"state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
10215 1727204068.12237: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204068.12246: _low_level_execute_command(): starting 10215 1727204068.12250: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204067.271545-12114-221886665611516/ > /dev/null 2>&1 && sleep 0' 10215 1727204068.12700: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204068.12704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.12707: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.12711: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204068.12713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.12759: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204068.12766: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.12804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.14729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.14779: stderr chunk (state=3): >>><<< 10215 1727204068.14782: stdout chunk (state=3): >>><<< 10215 1727204068.14799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204068.14807: handler run complete 10215 1727204068.14837: attempt loop complete, returning result 10215 1727204068.14840: _execute() done 10215 1727204068.14842: dumping result to json 10215 1727204068.14852: done dumping result, returning 10215 1727204068.14866: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-3c74-8f8e-00000000008c] 10215 1727204068.14871: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008c 10215 1727204068.14993: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008c 10215 1727204068.14996: WORKER PROCESS EXITING changed: [managed-node3] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 10215 1727204068.15145: no more pending results, returning what we have 10215 1727204068.15149: results queue empty 10215 1727204068.15151: checking for any_errors_fatal 10215 1727204068.15157: done checking for any_errors_fatal 10215 1727204068.15158: checking for max_fail_percentage 10215 1727204068.15160: done checking for max_fail_percentage 10215 1727204068.15161: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.15162: done checking to see if all hosts have failed 10215 1727204068.15163: getting the remaining hosts for this loop 10215 1727204068.15164: done getting the remaining hosts for this loop 10215 1727204068.15169: getting the next task for host managed-node3 10215 1727204068.15176: done getting next task for host managed-node3 10215 1727204068.15180: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 10215 1727204068.15184: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204068.15204: getting variables 10215 1727204068.15206: in VariableManager get_vars() 10215 1727204068.15247: Calling all_inventory to load vars for managed-node3 10215 1727204068.15251: Calling groups_inventory to load vars for managed-node3 10215 1727204068.15253: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.15264: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.15267: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.15270: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.16599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204068.18190: done with get_vars() 10215 1727204068.18216: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:28 -0400 (0:00:01.204) 0:00:36.750 ***** 10215 1727204068.18295: entering _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10215 1727204068.18572: worker is 1 (out of 1 available) 10215 1727204068.18590: exiting _queue_task() for managed-node3/fedora.linux_system_roles.network_state 10215 1727204068.18603: done queuing things up, now waiting for results queue to drain 10215 1727204068.18605: waiting for pending results... 10215 1727204068.18807: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state 10215 1727204068.18949: in run() - task 12b410aa-8751-3c74-8f8e-00000000008d 10215 1727204068.18957: variable 'ansible_search_path' from source: unknown 10215 1727204068.18961: variable 'ansible_search_path' from source: unknown 10215 1727204068.18996: calling self._execute() 10215 1727204068.19079: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.19086: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.19098: variable 'omit' from source: magic vars 10215 1727204068.19493: variable 'ansible_distribution_major_version' from source: facts 10215 1727204068.19696: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204068.19700: variable 'network_state' from source: role '' defaults 10215 1727204068.19703: Evaluated conditional (network_state != {}): False 10215 1727204068.19705: when evaluation is False, skipping this task 10215 1727204068.19710: _execute() done 10215 1727204068.19712: dumping result to json 10215 1727204068.19715: done dumping result, returning 10215 1727204068.19729: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-3c74-8f8e-00000000008d] 10215 1727204068.19742: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008d skipping: [managed-node3] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 10215 1727204068.19927: no more pending results, returning what we have 10215 1727204068.19934: results queue empty 10215 1727204068.19936: checking for any_errors_fatal 10215 1727204068.19958: done checking for any_errors_fatal 10215 1727204068.19959: checking for max_fail_percentage 10215 
1727204068.19961: done checking for max_fail_percentage 10215 1727204068.19963: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.19964: done checking to see if all hosts have failed 10215 1727204068.19965: getting the remaining hosts for this loop 10215 1727204068.19966: done getting the remaining hosts for this loop 10215 1727204068.19971: getting the next task for host managed-node3 10215 1727204068.19980: done getting next task for host managed-node3 10215 1727204068.19985: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10215 1727204068.19991: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204068.20019: getting variables 10215 1727204068.20021: in VariableManager get_vars() 10215 1727204068.20070: Calling all_inventory to load vars for managed-node3 10215 1727204068.20073: Calling groups_inventory to load vars for managed-node3 10215 1727204068.20077: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.20450: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.20456: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.20463: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008d 10215 1727204068.20466: WORKER PROCESS EXITING 10215 1727204068.20471: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.21823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204068.24141: done with get_vars() 10215 1727204068.24173: done getting variables 10215 1727204068.24227: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.059) 0:00:36.809 ***** 10215 1727204068.24260: entering _queue_task() for managed-node3/debug 10215 1727204068.24538: worker is 1 (out of 1 available) 10215 1727204068.24554: exiting _queue_task() for managed-node3/debug 10215 1727204068.24568: done queuing things up, now waiting for results queue to drain 10215 1727204068.24570: waiting for pending results... 
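Two things are worth noting at this point in the trace. The 'Configure networking state' task just above is skipped because network_state comes from the role defaults as an empty dict, so the guard network_state != {} is False. Before that, the 'Configure networking connection profiles' task executed fedora.linux_system_roles.network_connections on managed-node3 with provider "nm" and a connection list naming bond0.1, bond0.0 and bond0, each with persistent_state "absent" and state "down"; earlier in the trace the play vars controller_profile, port1_profile and port2_profile resolved into that network_connections value, and the "# Ansible managed" header passed as __header was rendered via the get_ansible_managed.j2 template lookup. A minimal sketch of a play that would request the same teardown through the role's public interface follows; the play structure is an assumption for illustration, while the connection entries mirror the module_args recorded above.

    # Sketch of an equivalent role call; only the connection entries are taken from the log above.
    - hosts: managed-node3
      vars:
        network_connections:
          - name: bond0.1
            persistent_state: absent
            state: down
          - name: bond0.0
            persistent_state: absent
            state: down
          - name: bond0
            persistent_state: absent
            state: down
      roles:
        - fedora.linux_system_roles.network
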
10215 1727204068.24776: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 10215 1727204068.24895: in run() - task 12b410aa-8751-3c74-8f8e-00000000008e 10215 1727204068.24910: variable 'ansible_search_path' from source: unknown 10215 1727204068.24915: variable 'ansible_search_path' from source: unknown 10215 1727204068.24952: calling self._execute() 10215 1727204068.25040: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.25047: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.25057: variable 'omit' from source: magic vars 10215 1727204068.25391: variable 'ansible_distribution_major_version' from source: facts 10215 1727204068.25403: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204068.25412: variable 'omit' from source: magic vars 10215 1727204068.25479: variable 'omit' from source: magic vars 10215 1727204068.25515: variable 'omit' from source: magic vars 10215 1727204068.25556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204068.25591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204068.25609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204068.25629: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.25640: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.25669: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204068.25673: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.25677: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.25762: Set connection var ansible_connection to ssh 10215 1727204068.25769: Set connection var ansible_pipelining to False 10215 1727204068.25776: Set connection var ansible_shell_type to sh 10215 1727204068.25782: Set connection var ansible_timeout to 10 10215 1727204068.25794: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204068.25803: Set connection var ansible_shell_executable to /bin/sh 10215 1727204068.25842: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.25846: variable 'ansible_connection' from source: unknown 10215 1727204068.25895: variable 'ansible_module_compression' from source: unknown 10215 1727204068.25898: variable 'ansible_shell_type' from source: unknown 10215 1727204068.25901: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.25903: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.25905: variable 'ansible_pipelining' from source: unknown 10215 1727204068.25909: variable 'ansible_timeout' from source: unknown 10215 1727204068.25911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.26060: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 
1727204068.26081: variable 'omit' from source: magic vars 10215 1727204068.26094: starting attempt loop 10215 1727204068.26203: running the handler 10215 1727204068.26269: variable '__network_connections_result' from source: set_fact 10215 1727204068.26339: handler run complete 10215 1727204068.26367: attempt loop complete, returning result 10215 1727204068.26375: _execute() done 10215 1727204068.26383: dumping result to json 10215 1727204068.26395: done dumping result, returning 10215 1727204068.26413: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-3c74-8f8e-00000000008e] 10215 1727204068.26429: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008e ok: [managed-node3] => { "__network_connections_result.stderr_lines": [ "" ] } 10215 1727204068.26624: no more pending results, returning what we have 10215 1727204068.26629: results queue empty 10215 1727204068.26630: checking for any_errors_fatal 10215 1727204068.26636: done checking for any_errors_fatal 10215 1727204068.26637: checking for max_fail_percentage 10215 1727204068.26640: done checking for max_fail_percentage 10215 1727204068.26641: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.26642: done checking to see if all hosts have failed 10215 1727204068.26643: getting the remaining hosts for this loop 10215 1727204068.26645: done getting the remaining hosts for this loop 10215 1727204068.26650: getting the next task for host managed-node3 10215 1727204068.26658: done getting next task for host managed-node3 10215 1727204068.26662: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10215 1727204068.26667: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204068.26681: getting variables 10215 1727204068.26683: in VariableManager get_vars() 10215 1727204068.26730: Calling all_inventory to load vars for managed-node3 10215 1727204068.26733: Calling groups_inventory to load vars for managed-node3 10215 1727204068.26736: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.26749: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.26752: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.26756: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.27351: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008e 10215 1727204068.27355: WORKER PROCESS EXITING 10215 1727204068.29044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204068.32065: done with get_vars() 10215 1727204068.32109: done getting variables 10215 1727204068.32186: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.079) 0:00:36.889 ***** 10215 1727204068.32233: entering _queue_task() for managed-node3/debug 10215 1727204068.32630: worker is 1 (out of 1 available) 10215 1727204068.32644: exiting _queue_task() for managed-node3/debug 10215 1727204068.32656: done queuing things up, now waiting for results queue to drain 10215 1727204068.32658: waiting for pending results... 
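For reference, the two debug tasks traced here most likely take a shape like the sketch below. The task names and the __network_connections_result variable are taken from the log itself; the actual YAML in the role's tasks/main.yml is not reproduced in this trace, so treat this as an approximation rather than the role's verified source.

```yaml
# Approximate reconstruction of the two debug tasks seen in the trace
# (task names and variable names come from the log; file contents not verified here).
- name: Show stderr messages for the network_connections
  debug:
    var: __network_connections_result.stderr_lines

- name: Show debug messages for the network_connections
  debug:
    var: __network_connections_result
```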
10215 1727204068.32984: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 10215 1727204068.33178: in run() - task 12b410aa-8751-3c74-8f8e-00000000008f 10215 1727204068.33205: variable 'ansible_search_path' from source: unknown 10215 1727204068.33219: variable 'ansible_search_path' from source: unknown 10215 1727204068.33268: calling self._execute() 10215 1727204068.33387: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.33406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.33428: variable 'omit' from source: magic vars 10215 1727204068.33905: variable 'ansible_distribution_major_version' from source: facts 10215 1727204068.33928: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204068.33941: variable 'omit' from source: magic vars 10215 1727204068.34050: variable 'omit' from source: magic vars 10215 1727204068.34294: variable 'omit' from source: magic vars 10215 1727204068.34298: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204068.34301: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204068.34303: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204068.34305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.34310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.34313: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204068.34322: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.34331: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.34462: Set connection var ansible_connection to ssh 10215 1727204068.34476: Set connection var ansible_pipelining to False 10215 1727204068.34487: Set connection var ansible_shell_type to sh 10215 1727204068.34501: Set connection var ansible_timeout to 10 10215 1727204068.34516: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204068.34536: Set connection var ansible_shell_executable to /bin/sh 10215 1727204068.34565: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.34574: variable 'ansible_connection' from source: unknown 10215 1727204068.34582: variable 'ansible_module_compression' from source: unknown 10215 1727204068.34591: variable 'ansible_shell_type' from source: unknown 10215 1727204068.34599: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.34606: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.34618: variable 'ansible_pipelining' from source: unknown 10215 1727204068.34625: variable 'ansible_timeout' from source: unknown 10215 1727204068.34634: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.34818: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 
1727204068.34838: variable 'omit' from source: magic vars 10215 1727204068.34849: starting attempt loop 10215 1727204068.34861: running the handler 10215 1727204068.34927: variable '__network_connections_result' from source: set_fact 10215 1727204068.35033: variable '__network_connections_result' from source: set_fact 10215 1727204068.35219: handler run complete 10215 1727204068.35263: attempt loop complete, returning result 10215 1727204068.35272: _execute() done 10215 1727204068.35295: dumping result to json 10215 1727204068.35299: done dumping result, returning 10215 1727204068.35395: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-3c74-8f8e-00000000008f] 10215 1727204068.35398: sending task result for task 12b410aa-8751-3c74-8f8e-00000000008f 10215 1727204068.35480: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000008f 10215 1727204068.35484: WORKER PROCESS EXITING ok: [managed-node3] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 10215 1727204068.35615: no more pending results, returning what we have 10215 1727204068.35620: results queue empty 10215 1727204068.35622: checking for any_errors_fatal 10215 1727204068.35631: done checking for any_errors_fatal 10215 1727204068.35632: checking for max_fail_percentage 10215 1727204068.35635: done checking for max_fail_percentage 10215 1727204068.35636: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.35637: done checking to see if all hosts have failed 10215 1727204068.35638: getting the remaining hosts for this loop 10215 1727204068.35640: done getting the remaining hosts for this loop 10215 1727204068.35645: getting the next task for host managed-node3 10215 1727204068.35654: done getting next task for host managed-node3 10215 1727204068.35659: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10215 1727204068.35664: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204068.35679: getting variables 10215 1727204068.35681: in VariableManager get_vars() 10215 1727204068.35934: Calling all_inventory to load vars for managed-node3 10215 1727204068.35938: Calling groups_inventory to load vars for managed-node3 10215 1727204068.35942: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.35955: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.35966: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.35971: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.38305: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204068.41262: done with get_vars() 10215 1727204068.41301: done getting variables 10215 1727204068.41374: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.091) 0:00:36.981 ***** 10215 1727204068.41420: entering _queue_task() for managed-node3/debug 10215 1727204068.41846: worker is 1 (out of 1 available) 10215 1727204068.41861: exiting _queue_task() for managed-node3/debug 10215 1727204068.41874: done queuing things up, now waiting for results queue to drain 10215 1727204068.41876: waiting for pending results... 10215 1727204068.42328: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 10215 1727204068.42336: in run() - task 12b410aa-8751-3c74-8f8e-000000000090 10215 1727204068.42359: variable 'ansible_search_path' from source: unknown 10215 1727204068.42367: variable 'ansible_search_path' from source: unknown 10215 1727204068.42416: calling self._execute() 10215 1727204068.42536: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.42554: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.42573: variable 'omit' from source: magic vars 10215 1727204068.43033: variable 'ansible_distribution_major_version' from source: facts 10215 1727204068.43050: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204068.43395: variable 'network_state' from source: role '' defaults 10215 1727204068.43399: Evaluated conditional (network_state != {}): False 10215 1727204068.43402: when evaluation is False, skipping this task 10215 1727204068.43404: _execute() done 10215 1727204068.43409: dumping result to json 10215 1727204068.43411: done dumping result, returning 10215 1727204068.43414: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-3c74-8f8e-000000000090] 10215 1727204068.43416: sending task result for task 12b410aa-8751-3c74-8f8e-000000000090 10215 1727204068.43495: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000090 10215 1727204068.43499: WORKER PROCESS EXITING skipping: [managed-node3] => { "false_condition": "network_state != {}" } 10215 1727204068.43556: no more pending results, returning what we 
have 10215 1727204068.43561: results queue empty 10215 1727204068.43563: checking for any_errors_fatal 10215 1727204068.43574: done checking for any_errors_fatal 10215 1727204068.43575: checking for max_fail_percentage 10215 1727204068.43577: done checking for max_fail_percentage 10215 1727204068.43579: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.43580: done checking to see if all hosts have failed 10215 1727204068.43581: getting the remaining hosts for this loop 10215 1727204068.43583: done getting the remaining hosts for this loop 10215 1727204068.43591: getting the next task for host managed-node3 10215 1727204068.43599: done getting next task for host managed-node3 10215 1727204068.43605: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 10215 1727204068.43613: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204068.43638: getting variables 10215 1727204068.43641: in VariableManager get_vars() 10215 1727204068.43688: Calling all_inventory to load vars for managed-node3 10215 1727204068.43894: Calling groups_inventory to load vars for managed-node3 10215 1727204068.43898: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.43912: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.43916: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.43920: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.48451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204068.54928: done with get_vars() 10215 1727204068.54977: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:28 -0400 (0:00:00.136) 0:00:37.118 ***** 10215 1727204068.55111: entering _queue_task() for managed-node3/ping 10215 1727204068.55536: worker is 1 (out of 1 available) 10215 1727204068.55548: exiting _queue_task() for managed-node3/ping 10215 1727204068.55561: done queuing things up, now waiting for results queue to drain 10215 1727204068.55563: waiting for pending results... 
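The skip recorded above follows from the task-level condition network_state != {}: network_state still holds the role default (an empty dict), so the executor skips the task even though the role-wide ansible_distribution_major_version != '6' guard passed. The connectivity re-test queued next resolves to the bare ping module. A plausible sketch of both tasks, inferred from the conditionals and module names in the trace rather than copied from main.yml (the body of the skipped task is an assumption, since it never ran here):

```yaml
# Inferred from the trace; not the verified contents of
# roles/network/tasks/main.yml:186 and :192.
- name: Show debug messages for the network_state
  debug:
    var: network_state        # assumed body; the task was skipped before running
  when: network_state != {}   # the false_condition reported in the log

- name: Re-test connectivity
  ping:                        # the module queued for the re-test task above
```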
10215 1727204068.55914: running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity 10215 1727204068.56035: in run() - task 12b410aa-8751-3c74-8f8e-000000000091 10215 1727204068.56057: variable 'ansible_search_path' from source: unknown 10215 1727204068.56065: variable 'ansible_search_path' from source: unknown 10215 1727204068.56117: calling self._execute() 10215 1727204068.56230: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.56244: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.56260: variable 'omit' from source: magic vars 10215 1727204068.56716: variable 'ansible_distribution_major_version' from source: facts 10215 1727204068.56736: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204068.56750: variable 'omit' from source: magic vars 10215 1727204068.56848: variable 'omit' from source: magic vars 10215 1727204068.56905: variable 'omit' from source: magic vars 10215 1727204068.56960: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204068.57015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204068.57042: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204068.57070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.57209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204068.57214: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204068.57216: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.57219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.57284: Set connection var ansible_connection to ssh 10215 1727204068.57300: Set connection var ansible_pipelining to False 10215 1727204068.57319: Set connection var ansible_shell_type to sh 10215 1727204068.57332: Set connection var ansible_timeout to 10 10215 1727204068.57343: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204068.57358: Set connection var ansible_shell_executable to /bin/sh 10215 1727204068.57393: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.57403: variable 'ansible_connection' from source: unknown 10215 1727204068.57414: variable 'ansible_module_compression' from source: unknown 10215 1727204068.57426: variable 'ansible_shell_type' from source: unknown 10215 1727204068.57433: variable 'ansible_shell_executable' from source: unknown 10215 1727204068.57441: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204068.57449: variable 'ansible_pipelining' from source: unknown 10215 1727204068.57455: variable 'ansible_timeout' from source: unknown 10215 1727204068.57464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204068.57736: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 10215 1727204068.57763: variable 'omit' from source: magic vars 10215 
1727204068.57774: starting attempt loop 10215 1727204068.57781: running the handler 10215 1727204068.57805: _low_level_execute_command(): starting 10215 1727204068.57821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204068.58611: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204068.58634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.58713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.58773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204068.58793: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.58818: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.58950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.60698: stdout chunk (state=3): >>>/root <<< 10215 1727204068.60915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.60919: stdout chunk (state=3): >>><<< 10215 1727204068.60922: stderr chunk (state=3): >>><<< 10215 1727204068.60945: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204068.60965: _low_level_execute_command(): starting 10215 1727204068.60977: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762 `" && echo ansible-tmp-1727204068.609524-12174-32864274129762="` 
echo /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762 `" ) && sleep 0' 10215 1727204068.61644: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204068.61660: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.61715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204068.61742: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204068.61836: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.61869: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.61946: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.63965: stdout chunk (state=3): >>>ansible-tmp-1727204068.609524-12174-32864274129762=/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762 <<< 10215 1727204068.64105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.64222: stderr chunk (state=3): >>><<< 10215 1727204068.64238: stdout chunk (state=3): >>><<< 10215 1727204068.64261: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204068.609524-12174-32864274129762=/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204068.64329: variable 'ansible_module_compression' from source: unknown 10215 1727204068.64394: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 10215 1727204068.64496: variable 
'ansible_facts' from source: unknown 10215 1727204068.64531: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py 10215 1727204068.64717: Sending initial data 10215 1727204068.64727: Sent initial data (151 bytes) 10215 1727204068.65411: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.65453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204068.65475: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.65506: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.65562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.67265: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204068.67270: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204068.67424: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpavziew7i /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py <<< 10215 1727204068.67429: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpavziew7i" to remote "/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py" <<< 10215 1727204068.68551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.68564: stdout chunk (state=3): >>><<< 10215 1727204068.68709: stderr chunk (state=3): >>><<< 10215 1727204068.68713: done transferring module to remote 10215 1727204068.68716: _low_level_execute_command(): starting 10215 1727204068.68718: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/ /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py && sleep 0' 10215 1727204068.69296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204068.69316: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.69338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204068.69361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204068.69380: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204068.69455: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.69511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204068.69530: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.69563: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.69635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.71523: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.71598: stderr chunk (state=3): >>><<< 10215 1727204068.71629: stdout chunk (state=3): >>><<< 10215 1727204068.71647: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 
originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204068.71658: _low_level_execute_command(): starting 10215 1727204068.71717: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/AnsiballZ_ping.py && sleep 0' 10215 1727204068.72369: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204068.72391: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.72414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204068.72494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204068.72550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204068.72572: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.72601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.72674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.89857: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 10215 1727204068.91239: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204068.91331: stderr chunk (state=3): >>><<< 10215 1727204068.91372: stdout chunk (state=3): >>><<< 10215 1727204068.91538: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204068.91544: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204068.91547: _low_level_execute_command(): starting 10215 1727204068.91549: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204068.609524-12174-32864274129762/ > /dev/null 2>&1 && sleep 0' 10215 1727204068.92264: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204068.92305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.92624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204068.92909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204068.92932: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204068.93004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204068.94946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204068.95297: stderr chunk (state=3): >>><<< 10215 1727204068.95301: stdout chunk (state=3): >>><<< 10215 1727204068.95304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204068.95315: handler run complete 10215 1727204068.95318: attempt loop complete, returning result 10215 1727204068.95320: _execute() done 10215 1727204068.95322: dumping result to json 10215 1727204068.95324: done dumping result, returning 10215 1727204068.95326: done running TaskExecutor() for managed-node3/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-3c74-8f8e-000000000091] 10215 1727204068.95329: sending task result for task 12b410aa-8751-3c74-8f8e-000000000091 10215 1727204068.95695: done sending task result for task 12b410aa-8751-3c74-8f8e-000000000091 10215 1727204068.95699: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "ping": "pong" } 10215 1727204068.95779: no more pending results, returning what we have 10215 1727204068.95784: results queue empty 10215 1727204068.95786: checking for any_errors_fatal 10215 1727204068.95797: done checking for any_errors_fatal 10215 1727204068.95798: checking for max_fail_percentage 10215 1727204068.95800: done checking for max_fail_percentage 10215 1727204068.95802: checking to see if all hosts have failed and the running result is not ok 10215 1727204068.95803: done checking to see if all hosts have failed 10215 1727204068.95804: getting the remaining hosts for this loop 10215 1727204068.95806: done getting the remaining hosts for this loop 10215 1727204068.95814: getting the next task for host managed-node3 10215 1727204068.95827: done getting next task for host managed-node3 10215 1727204068.95830: ^ task is: TASK: meta (role_complete) 10215 1727204068.95835: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204068.95851: getting variables 10215 1727204068.95853: in VariableManager get_vars() 10215 1727204068.96115: Calling all_inventory to load vars for managed-node3 10215 1727204068.96119: Calling groups_inventory to load vars for managed-node3 10215 1727204068.96123: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204068.96137: Calling all_plugins_play to load vars for managed-node3 10215 1727204068.96141: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204068.96145: Calling groups_plugins_play to load vars for managed-node3 10215 1727204068.98382: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204069.02054: done with get_vars() 10215 1727204069.02104: done getting variables 10215 1727204069.02223: done queuing things up, now waiting for results queue to drain 10215 1727204069.02226: results queue empty 10215 1727204069.02227: checking for any_errors_fatal 10215 1727204069.02232: done checking for any_errors_fatal 10215 1727204069.02233: checking for max_fail_percentage 10215 1727204069.02234: done checking for max_fail_percentage 10215 1727204069.02235: checking to see if all hosts have failed and the running result is not ok 10215 1727204069.02236: done checking to see if all hosts have failed 10215 1727204069.02237: getting the remaining hosts for this loop 10215 1727204069.02238: done getting the remaining hosts for this loop 10215 1727204069.02242: getting the next task for host managed-node3 10215 1727204069.02249: done getting next task for host managed-node3 10215 1727204069.02252: ^ task is: TASK: Delete the device '{{ controller_device }}' 10215 1727204069.02255: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204069.02261: getting variables 10215 1727204069.02262: in VariableManager get_vars() 10215 1727204069.02383: Calling all_inventory to load vars for managed-node3 10215 1727204069.02386: Calling groups_inventory to load vars for managed-node3 10215 1727204069.02392: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204069.02399: Calling all_plugins_play to load vars for managed-node3 10215 1727204069.02402: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204069.02406: Calling groups_plugins_play to load vars for managed-node3 10215 1727204069.04524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204069.07436: done with get_vars() 10215 1727204069.07482: done getting variables 10215 1727204069.07548: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 10215 1727204069.07697: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Tuesday 24 September 2024 14:54:29 -0400 (0:00:00.526) 0:00:37.644 ***** 10215 1727204069.07735: entering _queue_task() for managed-node3/command 10215 1727204069.08427: worker is 1 (out of 1 available) 10215 1727204069.08440: exiting _queue_task() for managed-node3/command 10215 1727204069.08452: done queuing things up, now waiting for results queue to drain 10215 1727204069.08454: waiting for pending results... 
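With the role finished (the meta: role_complete task above), control returns to the test playbook, which now removes the bond device it created; controller_device is a play var that resolves to nm-bond here. The module arguments are not visible at this point in the trace, so the ip link delete line below is a hypothetical placeholder that only illustrates the general shape of such a cleanup task:

```yaml
# Hypothetical sketch of the task at tests_bond.yml:114. Only the task name,
# the command action, and the controller_device play var are confirmed by the
# trace; the actual command line is an assumption for illustration.
- name: Delete the device '{{ controller_device }}'
  command: ip link delete {{ controller_device }}
```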
10215 1727204069.08599: running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' 10215 1727204069.08744: in run() - task 12b410aa-8751-3c74-8f8e-0000000000c1 10215 1727204069.08769: variable 'ansible_search_path' from source: unknown 10215 1727204069.08828: calling self._execute() 10215 1727204069.08953: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.08970: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.08987: variable 'omit' from source: magic vars 10215 1727204069.09469: variable 'ansible_distribution_major_version' from source: facts 10215 1727204069.09553: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204069.09556: variable 'omit' from source: magic vars 10215 1727204069.09559: variable 'omit' from source: magic vars 10215 1727204069.09677: variable 'controller_device' from source: play vars 10215 1727204069.09711: variable 'omit' from source: magic vars 10215 1727204069.09769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204069.09827: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204069.09857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204069.09891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204069.09917: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204069.09996: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204069.10001: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.10004: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.10124: Set connection var ansible_connection to ssh 10215 1727204069.10139: Set connection var ansible_pipelining to False 10215 1727204069.10151: Set connection var ansible_shell_type to sh 10215 1727204069.10217: Set connection var ansible_timeout to 10 10215 1727204069.10221: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204069.10224: Set connection var ansible_shell_executable to /bin/sh 10215 1727204069.10233: variable 'ansible_shell_executable' from source: unknown 10215 1727204069.10242: variable 'ansible_connection' from source: unknown 10215 1727204069.10250: variable 'ansible_module_compression' from source: unknown 10215 1727204069.10258: variable 'ansible_shell_type' from source: unknown 10215 1727204069.10266: variable 'ansible_shell_executable' from source: unknown 10215 1727204069.10274: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.10283: variable 'ansible_pipelining' from source: unknown 10215 1727204069.10293: variable 'ansible_timeout' from source: unknown 10215 1727204069.10304: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.10498: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204069.10543: variable 'omit' from source: magic vars 10215 
1727204069.10546: starting attempt loop 10215 1727204069.10549: running the handler 10215 1727204069.10565: _low_level_execute_command(): starting 10215 1727204069.10598: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204069.11496: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.11524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.11539: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.11617: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.13362: stdout chunk (state=3): >>>/root <<< 10215 1727204069.13554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.13673: stderr chunk (state=3): >>><<< 10215 1727204069.13703: stdout chunk (state=3): >>><<< 10215 1727204069.13787: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.13999: _low_level_execute_command(): starting 10215 1727204069.14003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841 `" && echo ansible-tmp-1727204069.1379976-12196-11804687111841="` echo /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841 `" ) && sleep 0' 10215 1727204069.15256: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 <<< 10215 1727204069.15267: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.15280: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.15300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.15318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.15351: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204069.15363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.15379: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204069.15391: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204069.15402: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204069.15617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 10215 1727204069.15635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.15652: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.15722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.18132: stdout chunk (state=3): >>>ansible-tmp-1727204069.1379976-12196-11804687111841=/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841 <<< 10215 1727204069.18496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.18501: stdout chunk (state=3): >>><<< 10215 1727204069.18503: stderr chunk (state=3): >>><<< 10215 1727204069.18596: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204069.1379976-12196-11804687111841=/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.18600: variable 
'ansible_module_compression' from source: unknown 10215 1727204069.18895: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204069.18899: variable 'ansible_facts' from source: unknown 10215 1727204069.18978: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py 10215 1727204069.19318: Sending initial data 10215 1727204069.19503: Sent initial data (155 bytes) 10215 1727204069.20479: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204069.20816: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.20838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.21067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.22713: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204069.22743: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204069.22782: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpwsjj1sgz /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py <<< 10215 1727204069.22798: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py" <<< 10215 1727204069.22915: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpwsjj1sgz" to remote "/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py" <<< 10215 1727204069.24841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.24997: stderr chunk (state=3): >>><<< 10215 1727204069.25001: stdout chunk (state=3): >>><<< 10215 1727204069.25003: done transferring module to remote 10215 1727204069.25009: _low_level_execute_command(): starting 10215 1727204069.25057: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/ /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py && sleep 0' 10215 1727204069.26367: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.26496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.26500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.26503: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204069.26505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.26515: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.26568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.26683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.26699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.26795: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.28774: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.28861: stderr chunk (state=3): >>><<< 10215 1727204069.28872: stdout chunk (state=3): >>><<< 10215 1727204069.28998: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.29001: _low_level_execute_command(): starting 10215 1727204069.29004: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/AnsiballZ_command.py && sleep 0' 10215 1727204069.30179: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.30202: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.30217: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.30318: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.30485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.30541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.30565: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.48531: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:29.476650", "end": "2024-09-24 14:54:29.484221", "delta": "0:00:00.007571", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204069.50068: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204069.50127: stderr chunk (state=3): >>><<< 10215 1727204069.50131: stdout chunk (state=3): >>><<< 10215 1727204069.50148: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-24 14:54:29.476650", "end": "2024-09-24 14:54:29.484221", "delta": "0:00:00.007571", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.10.90 closed. 
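The JSON object carried in the stdout chunks above is the return payload of the remote command module: ip link del nm-bond exited with rc=1 because the device no longer exists, and that non-zero rc is tolerated a few entries further down (the task result reports failed_when_result: false). The surrounding _low_level_execute_command() calls follow the standard per-task remote lifecycle. Below is a minimal, self-contained shell sketch of that lifecycle with an illustrative directory name and a stub payload standing in for the real AnsiballZ_command.py; it condenses the commands visible in this log rather than quoting any of them verbatim.

#!/bin/sh
# Sketch of the per-task remote lifecycle seen in the entries above and below:
# a private tmp dir under the remote home, payload upload, chmod, a Python run,
# then cleanup. Paths and the payload are illustrative stand-ins.
# (The controller first runs /bin/sh -c 'echo ~ && sleep 0' to discover $HOME.)
set -eu

TMPROOT="$HOME/.ansible/tmp"
TASKDIR="$TMPROOT/ansible-tmp-example-$$"     # real runs embed a timestamp and counter
PAYLOAD="$TASKDIR/AnsiballZ_command.py"

( umask 77 && mkdir -p "$TMPROOT" && mkdir "$TASKDIR" )

# Stand-in for the 'sftp> put ... AnsiballZ_command.py' upload in the log:
printf '%s\n' 'import json; print(json.dumps({"changed": True, "rc": 0}))' > "$PAYLOAD"

chmod u+x "$TASKDIR" "$PAYLOAD"
python3 "$PAYLOAD"          # this run invokes /usr/bin/python3.12 explicitly
rm -rf "$TASKDIR"           # the log's cleanup is 'rm -f -r ... > /dev/null 2>&1 && sleep 0'

Only the payload differs from task to task; the same sequence repeats for every command and shell task in the remainder of this run, including the cleanup entries that follow immediately below.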
10215 1727204069.50194: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204069.50202: _low_level_execute_command(): starting 10215 1727204069.50211: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.1379976-12196-11804687111841/ > /dev/null 2>&1 && sleep 0' 10215 1727204069.50674: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.50679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.50720: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204069.50723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204069.50726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204069.50733: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.50795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.50800: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.50802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.50838: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.52770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.52831: stderr chunk (state=3): >>><<< 10215 1727204069.52835: stdout chunk (state=3): >>><<< 10215 1727204069.52849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.52857: handler run complete 10215 1727204069.52879: Evaluated conditional (False): False 10215 1727204069.52882: Evaluated conditional (False): False 10215 1727204069.52902: attempt loop complete, returning result 10215 1727204069.52905: _execute() done 10215 1727204069.52910: dumping result to json 10215 1727204069.52915: done dumping result, returning 10215 1727204069.52924: done running TaskExecutor() for managed-node3/TASK: Delete the device 'nm-bond' [12b410aa-8751-3c74-8f8e-0000000000c1] 10215 1727204069.52931: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c1 10215 1727204069.53037: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c1 10215 1727204069.53040: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007571", "end": "2024-09-24 14:54:29.484221", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:29.476650" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 10215 1727204069.53127: no more pending results, returning what we have 10215 1727204069.53131: results queue empty 10215 1727204069.53132: checking for any_errors_fatal 10215 1727204069.53135: done checking for any_errors_fatal 10215 1727204069.53136: checking for max_fail_percentage 10215 1727204069.53139: done checking for max_fail_percentage 10215 1727204069.53140: checking to see if all hosts have failed and the running result is not ok 10215 1727204069.53141: done checking to see if all hosts have failed 10215 1727204069.53142: getting the remaining hosts for this loop 10215 1727204069.53144: done getting the remaining hosts for this loop 10215 1727204069.53150: getting the next task for host managed-node3 10215 1727204069.53160: done getting next task for host managed-node3 10215 1727204069.53163: ^ task is: TASK: Remove test interfaces 10215 1727204069.53167: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204069.53172: getting variables 10215 1727204069.53173: in VariableManager get_vars() 10215 1727204069.53221: Calling all_inventory to load vars for managed-node3 10215 1727204069.53224: Calling groups_inventory to load vars for managed-node3 10215 1727204069.53229: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204069.53241: Calling all_plugins_play to load vars for managed-node3 10215 1727204069.53244: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204069.53247: Calling groups_plugins_play to load vars for managed-node3 10215 1727204069.60426: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204069.62008: done with get_vars() 10215 1727204069.62036: done getting variables 10215 1727204069.62081: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:29 -0400 (0:00:00.543) 0:00:38.188 ***** 10215 1727204069.62108: entering _queue_task() for managed-node3/shell 10215 1727204069.62397: worker is 1 (out of 1 available) 10215 1727204069.62417: exiting _queue_task() for managed-node3/shell 10215 1727204069.62431: done queuing things up, now waiting for results queue to drain 10215 1727204069.62434: waiting for pending results... 
10215 1727204069.62630: running TaskExecutor() for managed-node3/TASK: Remove test interfaces 10215 1727204069.62749: in run() - task 12b410aa-8751-3c74-8f8e-0000000000c5 10215 1727204069.62761: variable 'ansible_search_path' from source: unknown 10215 1727204069.62767: variable 'ansible_search_path' from source: unknown 10215 1727204069.62805: calling self._execute() 10215 1727204069.62890: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.62900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.62909: variable 'omit' from source: magic vars 10215 1727204069.63249: variable 'ansible_distribution_major_version' from source: facts 10215 1727204069.63260: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204069.63267: variable 'omit' from source: magic vars 10215 1727204069.63323: variable 'omit' from source: magic vars 10215 1727204069.63458: variable 'dhcp_interface1' from source: play vars 10215 1727204069.63462: variable 'dhcp_interface2' from source: play vars 10215 1727204069.63481: variable 'omit' from source: magic vars 10215 1727204069.63524: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204069.63678: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204069.63682: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204069.63685: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204069.63688: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204069.63692: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204069.63695: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.63698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.63917: Set connection var ansible_connection to ssh 10215 1727204069.63921: Set connection var ansible_pipelining to False 10215 1727204069.63923: Set connection var ansible_shell_type to sh 10215 1727204069.63926: Set connection var ansible_timeout to 10 10215 1727204069.63929: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204069.63931: Set connection var ansible_shell_executable to /bin/sh 10215 1727204069.63934: variable 'ansible_shell_executable' from source: unknown 10215 1727204069.63936: variable 'ansible_connection' from source: unknown 10215 1727204069.63939: variable 'ansible_module_compression' from source: unknown 10215 1727204069.63941: variable 'ansible_shell_type' from source: unknown 10215 1727204069.63943: variable 'ansible_shell_executable' from source: unknown 10215 1727204069.63945: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204069.63947: variable 'ansible_pipelining' from source: unknown 10215 1727204069.63950: variable 'ansible_timeout' from source: unknown 10215 1727204069.63952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204069.64091: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204069.64095: variable 'omit' from source: magic vars 10215 1727204069.64098: starting attempt loop 10215 1727204069.64101: running the handler 10215 1727204069.64104: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204069.64184: _low_level_execute_command(): starting 10215 1727204069.64188: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204069.64844: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204069.64848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.64862: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.64872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.64888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.64896: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204069.64908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.64930: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204069.64952: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204069.64975: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204069.64991: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.65006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.65017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.65023: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.65097: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.65100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.65143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.66905: stdout chunk (state=3): >>>/root <<< 10215 1727204069.67099: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.67103: stdout chunk (state=3): >>><<< 10215 1727204069.67106: stderr chunk (state=3): >>><<< 10215 1727204069.67234: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.67238: _low_level_execute_command(): starting 10215 1727204069.67249: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049 `" && echo ansible-tmp-1727204069.67129-12223-233752726253049="` echo /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049 `" ) && sleep 0' 10215 1727204069.67805: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204069.67834: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.67851: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.67871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.67887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.67942: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.67994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.68012: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.68068: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.68087: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.68141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.68186: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.70151: stdout chunk (state=3): >>>ansible-tmp-1727204069.67129-12223-233752726253049=/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049 <<< 10215 1727204069.70313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.70317: stderr chunk (state=3): >>><<< 10215 1727204069.70323: stdout chunk (state=3): >>><<< 10215 1727204069.70345: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204069.67129-12223-233752726253049=/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.70375: variable 'ansible_module_compression' from source: unknown 10215 1727204069.70422: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204069.70458: variable 'ansible_facts' from source: unknown 10215 1727204069.70524: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py 10215 1727204069.70646: Sending initial data 10215 1727204069.70649: Sent initial data (154 bytes) 10215 1727204069.71164: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.71167: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.71170: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.71322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.71326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.71330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.72918: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 10215 1727204069.72923: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204069.72951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204069.72987: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmpn1igne9w /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py <<< 10215 1727204069.72992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py" <<< 10215 1727204069.73020: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmpn1igne9w" to remote "/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py" <<< 10215 1727204069.73796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.73867: stderr chunk (state=3): >>><<< 10215 1727204069.73876: stdout chunk (state=3): >>><<< 10215 1727204069.73904: done transferring module to remote 10215 1727204069.73918: _low_level_execute_command(): starting 10215 1727204069.73922: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/ /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py && sleep 0' 10215 1727204069.74535: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204069.74539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.74542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.74544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.74598: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.74602: stderr chunk (state=3): >>>debug2: match not found <<< 10215 1727204069.74605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.74611: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 10215 1727204069.74614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204069.74616: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 10215 1727204069.74618: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204069.74628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204069.74644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204069.74647: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204069.74657: stderr chunk (state=3): >>>debug2: match found <<< 10215 1727204069.74668: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.74755: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.74774: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.74781: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.74846: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.76740: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204069.76805: stderr chunk (state=3): >>><<< 10215 1727204069.76809: stdout chunk (state=3): >>><<< 10215 1727204069.76830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204069.76834: _low_level_execute_command(): starting 10215 1727204069.76839: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/AnsiballZ_command.py && sleep 0' 10215 1727204069.77513: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204069.77568: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204069.77591: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204069.77606: 
stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204069.77693: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204069.98783: stdout chunk (state=3): >>> <<< 10215 1727204069.98788: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:29.950332", "end": "2024-09-24 14:54:29.985948", "delta": "0:00:00.035616", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204070.00490: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204070.00558: stderr chunk (state=3): >>><<< 10215 1727204070.00563: stdout chunk (state=3): >>><<< 10215 1727204070.00580: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:29.950332", "end": "2024-09-24 14:54:29.985948", "delta": "0:00:00.035616", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
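For readability, this is the shell snippet embedded (JSON-escaped) in the module result above, as it ran on managed-node3; test1 and test2 correspond to the dhcp_interface1/dhcp_interface2 play vars the log pulled in while templating this task, and testbr is presumably the matching test bridge.

set -euxo pipefail
exec 1>&2
rc=0
ip link delete test1 || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link test1 - error "$rc"
fi
ip link delete test2 || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link test2 - error "$rc"
fi
ip link delete testbr || rc="$?"
if [ "$rc" != 0 ]; then
  echo ERROR - could not delete link testbr - error "$rc"
fi

The '+ exec', '+ rc=0', '+ ip link delete ...' lines in the stderr field above, and under STDERR in the task result below, are this script tracing itself: set -x echoes each command, and exec 1>&2 sends the remaining stdout to stderr so the trace and any error messages land in the same stream.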
10215 1727204070.00625: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204070.00636: _low_level_execute_command(): starting 10215 1727204070.00644: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204069.67129-12223-233752726253049/ > /dev/null 2>&1 && sleep 0' 10215 1727204070.01136: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.01141: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204070.01144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.01148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.01150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204070.01153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.01200: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.01214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.01226: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.01253: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.03194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.03246: stderr chunk (state=3): >>><<< 10215 1727204070.03249: stdout chunk (state=3): >>><<< 10215 1727204070.03269: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.03273: handler run complete 10215 1727204070.03296: Evaluated conditional (False): False 10215 1727204070.03309: attempt loop complete, returning result 10215 1727204070.03312: _execute() done 10215 1727204070.03315: dumping result to json 10215 1727204070.03321: done dumping result, returning 10215 1727204070.03331: done running TaskExecutor() for managed-node3/TASK: Remove test interfaces [12b410aa-8751-3c74-8f8e-0000000000c5] 10215 1727204070.03337: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c5 10215 1727204070.03445: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c5 10215 1727204070.03448: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.035616", "end": "2024-09-24 14:54:29.985948", "rc": 0, "start": "2024-09-24 14:54:29.950332" } STDERR: + exec + rc=0 + ip link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 10215 1727204070.03531: no more pending results, returning what we have 10215 1727204070.03535: results queue empty 10215 1727204070.03535: checking for any_errors_fatal 10215 1727204070.03548: done checking for any_errors_fatal 10215 1727204070.03549: checking for max_fail_percentage 10215 1727204070.03551: done checking for max_fail_percentage 10215 1727204070.03552: checking to see if all hosts have failed and the running result is not ok 10215 1727204070.03553: done checking to see if all hosts have failed 10215 1727204070.03554: getting the remaining hosts for this loop 10215 1727204070.03556: done getting the remaining hosts for this loop 10215 1727204070.03568: getting the next task for host managed-node3 10215 1727204070.03575: done getting next task for host managed-node3 10215 1727204070.03578: ^ task is: TASK: Stop dnsmasq/radvd services 10215 1727204070.03582: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204070.03587: getting variables 10215 1727204070.03588: in VariableManager get_vars() 10215 1727204070.03635: Calling all_inventory to load vars for managed-node3 10215 1727204070.03639: Calling groups_inventory to load vars for managed-node3 10215 1727204070.03642: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204070.03653: Calling all_plugins_play to load vars for managed-node3 10215 1727204070.03660: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204070.03664: Calling groups_plugins_play to load vars for managed-node3 10215 1727204070.05663: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204070.07907: done with get_vars() 10215 1727204070.07932: done getting variables 10215 1727204070.07986: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.459) 0:00:38.647 ***** 10215 1727204070.08018: entering _queue_task() for managed-node3/shell 10215 1727204070.08286: worker is 1 (out of 1 available) 10215 1727204070.08302: exiting _queue_task() for managed-node3/shell 10215 1727204070.08315: done queuing things up, now waiting for results queue to drain 10215 1727204070.08317: waiting for pending results... 
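Every connection in this run reuses an existing SSH control master (debug1: auto-mux: Trying existing master ... debug1: mux_client_request_session: master session id: 2), so the repeated _low_level_execute_command() calls skip the full handshake. As a rough sketch, assuming Ansible's default ControlMaster/ControlPersist ssh arguments rather than anything read from this run's configuration, the equivalent manual invocation would be:

# Sketch only: open or reuse a multiplexed connection to the managed host the
# way this log's connections do. The ControlPath pattern here is illustrative;
# Ansible keeps its control sockets elsewhere (under ~/.ansible/cp/ by default).
ssh -o ControlMaster=auto \
    -o ControlPersist=60s \
    -o ControlPath=~/.ssh/cm-%r@%h:%p \
    root@10.31.10.90 'echo ~ && sleep 0'

The first such call sets up the master; every later one, like the tasks in this log, is answered by the existing master and only pays the cost of a new mux session.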
10215 1727204070.08519: running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services 10215 1727204070.08649: in run() - task 12b410aa-8751-3c74-8f8e-0000000000c6 10215 1727204070.08665: variable 'ansible_search_path' from source: unknown 10215 1727204070.08669: variable 'ansible_search_path' from source: unknown 10215 1727204070.08825: calling self._execute() 10215 1727204070.08829: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.08847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.08850: variable 'omit' from source: magic vars 10215 1727204070.09396: variable 'ansible_distribution_major_version' from source: facts 10215 1727204070.09402: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204070.09405: variable 'omit' from source: magic vars 10215 1727204070.09457: variable 'omit' from source: magic vars 10215 1727204070.09541: variable 'omit' from source: magic vars 10215 1727204070.09597: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204070.09717: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204070.09756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204070.09804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204070.09827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204070.09878: variable 'inventory_hostname' from source: host vars for 'managed-node3' 10215 1727204070.09885: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.09892: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.09977: Set connection var ansible_connection to ssh 10215 1727204070.09987: Set connection var ansible_pipelining to False 10215 1727204070.09995: Set connection var ansible_shell_type to sh 10215 1727204070.10002: Set connection var ansible_timeout to 10 10215 1727204070.10012: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204070.10018: Set connection var ansible_shell_executable to /bin/sh 10215 1727204070.10038: variable 'ansible_shell_executable' from source: unknown 10215 1727204070.10041: variable 'ansible_connection' from source: unknown 10215 1727204070.10050: variable 'ansible_module_compression' from source: unknown 10215 1727204070.10053: variable 'ansible_shell_type' from source: unknown 10215 1727204070.10056: variable 'ansible_shell_executable' from source: unknown 10215 1727204070.10058: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.10062: variable 'ansible_pipelining' from source: unknown 10215 1727204070.10065: variable 'ansible_timeout' from source: unknown 10215 1727204070.10079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.10195: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204070.10209: variable 'omit' from source: magic vars 10215 
1727204070.10213: starting attempt loop 10215 1727204070.10216: running the handler 10215 1727204070.10227: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204070.10243: _low_level_execute_command(): starting 10215 1727204070.10250: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204070.10752: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.10787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.10794: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.10797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.10846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.10849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.10902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.12617: stdout chunk (state=3): >>>/root <<< 10215 1727204070.12723: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.12772: stderr chunk (state=3): >>><<< 10215 1727204070.12776: stdout chunk (state=3): >>><<< 10215 1727204070.12801: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 
1727204070.12814: _low_level_execute_command(): starting 10215 1727204070.12821: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184 `" && echo ansible-tmp-1727204070.1279922-12240-30891728828184="` echo /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184 `" ) && sleep 0' 10215 1727204070.13274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.13277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 <<< 10215 1727204070.13288: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 10215 1727204070.13302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found <<< 10215 1727204070.13306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.13338: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.13342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.13384: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.15351: stdout chunk (state=3): >>>ansible-tmp-1727204070.1279922-12240-30891728828184=/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184 <<< 10215 1727204070.15468: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.15515: stderr chunk (state=3): >>><<< 10215 1727204070.15519: stdout chunk (state=3): >>><<< 10215 1727204070.15536: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.1279922-12240-30891728828184=/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.15568: variable 'ansible_module_compression' from source: unknown 10215 1727204070.15613: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204070.15645: variable 'ansible_facts' from source: unknown 10215 1727204070.15714: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py 10215 1727204070.15825: Sending initial data 10215 1727204070.15829: Sent initial data (155 bytes) 10215 1727204070.16248: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.16283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204070.16286: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204070.16293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.16296: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.16298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.16352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.16357: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.16394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.17979: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 10215 1727204070.17983: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204070.18013: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 10215 1727204070.18055: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmputrrqv7m /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py <<< 10215 1727204070.18057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py" <<< 10215 1727204070.18086: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmputrrqv7m" to remote "/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py" <<< 10215 1727204070.18864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.18917: stderr chunk (state=3): >>><<< 10215 1727204070.18921: stdout chunk (state=3): >>><<< 10215 1727204070.19047: done transferring module to remote 10215 1727204070.19051: _low_level_execute_command(): starting 10215 1727204070.19054: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/ /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py && sleep 0' 10215 1727204070.19605: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 10215 1727204070.19610: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.19735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.19752: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.19778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.19798: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.19819: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.19887: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.21761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.21771: stderr chunk (state=3): >>><<< 10215 1727204070.21780: stdout chunk (state=3): >>><<< 10215 1727204070.21799: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.21803: _low_level_execute_command(): starting 10215 1727204070.21809: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/AnsiballZ_command.py && sleep 0' 10215 1727204070.22522: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.22550: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.22596: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.22600: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.22661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.42893: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:30.398364", "end": "2024-09-24 14:54:30.427582", "delta": "0:00:00.029218", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' 
/etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204070.44586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. <<< 10215 1727204070.44649: stderr chunk (state=3): >>><<< 10215 1727204070.44653: stdout chunk (state=3): >>><<< 10215 1727204070.44674: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:30.398364", "end": "2024-09-24 14:54:30.427582", "delta": "0:00:00.029218", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204070.44740: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204070.44744: _low_level_execute_command(): starting 10215 1727204070.44747: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184/ > /dev/null 2>&1 && sleep 0' 10215 1727204070.45239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.45244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.45247: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.45249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.45312: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.45316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.45320: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.45357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.47286: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.47335: stderr chunk (state=3): >>><<< 10215 1727204070.47338: stdout chunk (state=3): >>><<< 10215 1727204070.47356: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.47364: handler run complete 10215 1727204070.47384: Evaluated conditional (False): False 10215 1727204070.47396: attempt loop complete, returning result 10215 1727204070.47399: _execute() done 10215 1727204070.47404: dumping result to json 10215 1727204070.47411: done dumping result, returning 10215 1727204070.47420: done running TaskExecutor() for managed-node3/TASK: Stop dnsmasq/radvd services [12b410aa-8751-3c74-8f8e-0000000000c6] 10215 1727204070.47426: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c6 ok: [managed-node3] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.029218", "end": "2024-09-24 14:54:30.427582", "rc": 0, "start": "2024-09-24 14:54:30.398364" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 10215 1727204070.47630: no more pending results, returning what we have 10215 1727204070.47633: results queue empty 10215 1727204070.47635: checking for any_errors_fatal 10215 1727204070.47646: done checking for any_errors_fatal 10215 1727204070.47647: checking for max_fail_percentage 10215 1727204070.47649: done checking for max_fail_percentage 10215 1727204070.47650: checking to see if all hosts have failed and the running result is not ok 10215 1727204070.47651: done checking to see if all hosts have failed 10215 1727204070.47652: getting the remaining hosts for this loop 10215 1727204070.47654: done getting the remaining hosts for this loop 10215 1727204070.47658: getting the next task for host managed-node3 10215 1727204070.47668: done getting next task for host managed-node3 10215 1727204070.47671: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 10215 1727204070.47674: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child 
state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 10215 1727204070.47679: getting variables 10215 1727204070.47681: in VariableManager get_vars() 10215 1727204070.47734: Calling all_inventory to load vars for managed-node3 10215 1727204070.47738: Calling groups_inventory to load vars for managed-node3 10215 1727204070.47741: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204070.47747: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c6 10215 1727204070.47749: WORKER PROCESS EXITING 10215 1727204070.47760: Calling all_plugins_play to load vars for managed-node3 10215 1727204070.47763: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204070.47771: Calling groups_plugins_play to load vars for managed-node3 10215 1727204070.49147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204070.52098: done with get_vars() 10215 1727204070.52131: done getting variables 10215 1727204070.52181: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.441) 0:00:39.089 ***** 10215 1727204070.52210: entering _queue_task() for managed-node3/command 10215 1727204070.52471: worker is 1 (out of 1 available) 10215 1727204070.52487: exiting _queue_task() for managed-node3/command 10215 1727204070.52500: done queuing things up, now waiting for results queue to drain 10215 1727204070.52502: waiting for pending results... 
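Every shell/command task in this log is carried out through the same low-level remote command sequence: it was just recorded above for the dnsmasq/radvd task and repeats further below for the routes/DNS check. Condensed from the _low_level_execute_command entries, with TMP standing in for the per-task directory (here /root/.ansible/tmp/ansible-tmp-1727204070.1279922-12240-30891728828184):

    /bin/sh -c 'echo ~ && sleep 0'                                       # discover the remote user's home directory
    /bin/sh -c '( umask 77 && mkdir -p /root/.ansible/tmp && mkdir TMP ) && sleep 0'
                                                                         # create the private per-task temp dir (simplified; the real
                                                                         # command also echoes the generated name=TMP pair back so the
                                                                         # controller can parse the path from stdout)
    # AnsiballZ_command.py is then uploaded into TMP over the same SSH connection (the sftp put recorded above)
    /bin/sh -c 'chmod u+x TMP/ TMP/AnsiballZ_command.py && sleep 0'      # make the wrapper executable
    /bin/sh -c '/usr/bin/python3.12 TMP/AnsiballZ_command.py && sleep 0' # run the wrapped command module; the JSON result comes back on stdout
    /bin/sh -c 'rm -f -r TMP/ > /dev/null 2>&1 && sleep 0'               # clean up the temp dir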
10215 1727204070.52695: running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript 10215 1727204070.52784: in run() - task 12b410aa-8751-3c74-8f8e-0000000000c7 10215 1727204070.52799: variable 'ansible_search_path' from source: unknown 10215 1727204070.52836: calling self._execute() 10215 1727204070.52926: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.52933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.52943: variable 'omit' from source: magic vars 10215 1727204070.53270: variable 'ansible_distribution_major_version' from source: facts 10215 1727204070.53282: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204070.53383: variable 'network_provider' from source: set_fact 10215 1727204070.53388: Evaluated conditional (network_provider == "initscripts"): False 10215 1727204070.53391: when evaluation is False, skipping this task 10215 1727204070.53405: _execute() done 10215 1727204070.53412: dumping result to json 10215 1727204070.53415: done dumping result, returning 10215 1727204070.53418: done running TaskExecutor() for managed-node3/TASK: Restore the /etc/resolv.conf for initscript [12b410aa-8751-3c74-8f8e-0000000000c7] 10215 1727204070.53420: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c7 skipping: [managed-node3] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 10215 1727204070.53575: no more pending results, returning what we have 10215 1727204070.53579: results queue empty 10215 1727204070.53580: checking for any_errors_fatal 10215 1727204070.53593: done checking for any_errors_fatal 10215 1727204070.53594: checking for max_fail_percentage 10215 1727204070.53596: done checking for max_fail_percentage 10215 1727204070.53597: checking to see if all hosts have failed and the running result is not ok 10215 1727204070.53598: done checking to see if all hosts have failed 10215 1727204070.53599: getting the remaining hosts for this loop 10215 1727204070.53601: done getting the remaining hosts for this loop 10215 1727204070.53605: getting the next task for host managed-node3 10215 1727204070.53613: done getting next task for host managed-node3 10215 1727204070.53618: ^ task is: TASK: Verify network state restored to default 10215 1727204070.53622: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204070.53625: getting variables 10215 1727204070.53627: in VariableManager get_vars() 10215 1727204070.53664: Calling all_inventory to load vars for managed-node3 10215 1727204070.53667: Calling groups_inventory to load vars for managed-node3 10215 1727204070.53669: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204070.53680: Calling all_plugins_play to load vars for managed-node3 10215 1727204070.53683: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204070.53686: Calling groups_plugins_play to load vars for managed-node3 10215 1727204070.53705: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c7 10215 1727204070.53711: WORKER PROCESS EXITING 10215 1727204070.55482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204070.57098: done with get_vars() 10215 1727204070.57130: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.050) 0:00:39.139 ***** 10215 1727204070.57271: entering _queue_task() for managed-node3/include_tasks 10215 1727204070.57801: worker is 1 (out of 1 available) 10215 1727204070.57813: exiting _queue_task() for managed-node3/include_tasks 10215 1727204070.57823: done queuing things up, now waiting for results queue to drain 10215 1727204070.57825: waiting for pending results... 10215 1727204070.58078: running TaskExecutor() for managed-node3/TASK: Verify network state restored to default 10215 1727204070.58393: in run() - task 12b410aa-8751-3c74-8f8e-0000000000c8 10215 1727204070.58398: variable 'ansible_search_path' from source: unknown 10215 1727204070.58482: calling self._execute() 10215 1727204070.58636: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.58645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.58666: variable 'omit' from source: magic vars 10215 1727204070.59195: variable 'ansible_distribution_major_version' from source: facts 10215 1727204070.59216: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204070.59223: _execute() done 10215 1727204070.59226: dumping result to json 10215 1727204070.59233: done dumping result, returning 10215 1727204070.59239: done running TaskExecutor() for managed-node3/TASK: Verify network state restored to default [12b410aa-8751-3c74-8f8e-0000000000c8] 10215 1727204070.59247: sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c8 10215 1727204070.59402: no more pending results, returning what we have 10215 1727204070.59408: in VariableManager get_vars() 10215 1727204070.59468: Calling all_inventory to load vars for managed-node3 10215 1727204070.59472: Calling groups_inventory to load vars for managed-node3 10215 1727204070.59476: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204070.59495: Calling all_plugins_play to load vars for managed-node3 10215 1727204070.59500: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204070.59505: Calling groups_plugins_play to load vars for managed-node3 10215 1727204070.60106: done sending task result for task 12b410aa-8751-3c74-8f8e-0000000000c8 10215 1727204070.60113: WORKER PROCESS EXITING 10215 1727204070.63226: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204070.65174: done with get_vars() 10215 1727204070.65196: variable 'ansible_search_path' from source: unknown 10215 1727204070.65210: we have included files to process 10215 1727204070.65211: generating all_blocks data 10215 1727204070.65215: done generating all_blocks data 10215 1727204070.65219: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10215 1727204070.65220: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10215 1727204070.65222: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 10215 1727204070.65558: done processing included file 10215 1727204070.65560: iterating over new_blocks loaded from include file 10215 1727204070.65561: in VariableManager get_vars() 10215 1727204070.65576: done with get_vars() 10215 1727204070.65578: filtering new block on tags 10215 1727204070.65613: done filtering new block on tags 10215 1727204070.65616: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node3 10215 1727204070.65620: extending task lists for all hosts with included blocks 10215 1727204070.67220: done extending task lists 10215 1727204070.67221: done processing included files 10215 1727204070.67222: results queue empty 10215 1727204070.67223: checking for any_errors_fatal 10215 1727204070.67225: done checking for any_errors_fatal 10215 1727204070.67226: checking for max_fail_percentage 10215 1727204070.67227: done checking for max_fail_percentage 10215 1727204070.67227: checking to see if all hosts have failed and the running result is not ok 10215 1727204070.67228: done checking to see if all hosts have failed 10215 1727204070.67229: getting the remaining hosts for this loop 10215 1727204070.67230: done getting the remaining hosts for this loop 10215 1727204070.67231: getting the next task for host managed-node3 10215 1727204070.67235: done getting next task for host managed-node3 10215 1727204070.67236: ^ task is: TASK: Check routes and DNS 10215 1727204070.67239: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204070.67241: getting variables 10215 1727204070.67241: in VariableManager get_vars() 10215 1727204070.67252: Calling all_inventory to load vars for managed-node3 10215 1727204070.67254: Calling groups_inventory to load vars for managed-node3 10215 1727204070.67255: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204070.67260: Calling all_plugins_play to load vars for managed-node3 10215 1727204070.67262: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204070.67264: Calling groups_plugins_play to load vars for managed-node3 10215 1727204070.68409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204070.69951: done with get_vars() 10215 1727204070.69971: done getting variables 10215 1727204070.70009: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:54:30 -0400 (0:00:00.127) 0:00:39.267 ***** 10215 1727204070.70035: entering _queue_task() for managed-node3/shell 10215 1727204070.70294: worker is 1 (out of 1 available) 10215 1727204070.70313: exiting _queue_task() for managed-node3/shell 10215 1727204070.70326: done queuing things up, now waiting for results queue to drain 10215 1727204070.70328: waiting for pending results... 10215 1727204070.70521: running TaskExecutor() for managed-node3/TASK: Check routes and DNS 10215 1727204070.70615: in run() - task 12b410aa-8751-3c74-8f8e-00000000056d 10215 1727204070.70626: variable 'ansible_search_path' from source: unknown 10215 1727204070.70631: variable 'ansible_search_path' from source: unknown 10215 1727204070.70667: calling self._execute() 10215 1727204070.70744: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.70751: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.70761: variable 'omit' from source: magic vars 10215 1727204070.71086: variable 'ansible_distribution_major_version' from source: facts 10215 1727204070.71099: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204070.71105: variable 'omit' from source: magic vars 10215 1727204070.71150: variable 'omit' from source: magic vars 10215 1727204070.71179: variable 'omit' from source: magic vars 10215 1727204070.71222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 10215 1727204070.71254: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 10215 1727204070.71272: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 10215 1727204070.71290: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204070.71304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 10215 1727204070.71334: variable 'inventory_hostname' from source: host vars for 'managed-node3' 
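For readability, the script the Check routes and DNS task runs (reconstructed from the escaped _raw_params string in the module invocation recorded further below; the comments are added here and are not part of the original script) is:

    set -euo pipefail            # -e: stop on the first failing command; -u: error on unset variables
    echo IP
    ip a                         # interface addresses
    echo IP ROUTE
    ip route                     # IPv4 routes
    echo IP -6 ROUTE
    ip -6 route                  # IPv6 routes
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi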
10215 1727204070.71338: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.71340: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.71425: Set connection var ansible_connection to ssh 10215 1727204070.71432: Set connection var ansible_pipelining to False 10215 1727204070.71438: Set connection var ansible_shell_type to sh 10215 1727204070.71445: Set connection var ansible_timeout to 10 10215 1727204070.71458: Set connection var ansible_module_compression to ZIP_DEFLATED 10215 1727204070.71461: Set connection var ansible_shell_executable to /bin/sh 10215 1727204070.71480: variable 'ansible_shell_executable' from source: unknown 10215 1727204070.71483: variable 'ansible_connection' from source: unknown 10215 1727204070.71487: variable 'ansible_module_compression' from source: unknown 10215 1727204070.71491: variable 'ansible_shell_type' from source: unknown 10215 1727204070.71494: variable 'ansible_shell_executable' from source: unknown 10215 1727204070.71499: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204070.71505: variable 'ansible_pipelining' from source: unknown 10215 1727204070.71510: variable 'ansible_timeout' from source: unknown 10215 1727204070.71514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204070.71637: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204070.71648: variable 'omit' from source: magic vars 10215 1727204070.71654: starting attempt loop 10215 1727204070.71657: running the handler 10215 1727204070.71674: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 10215 1727204070.71686: _low_level_execute_command(): starting 10215 1727204070.71695: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 10215 1727204070.72234: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.72239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration <<< 10215 1727204070.72244: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.72301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.72305: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.72354: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.74112: stdout chunk (state=3): >>>/root <<< 10215 1727204070.74222: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.74273: stderr chunk (state=3): >>><<< 10215 1727204070.74276: stdout chunk (state=3): >>><<< 10215 1727204070.74301: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.74317: _low_level_execute_command(): starting 10215 1727204070.74323: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 `" && echo ansible-tmp-1727204070.7430127-12269-251934135146438="` echo /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 `" ) && sleep 0' 10215 1727204070.74756: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.74794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204070.74797: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found <<< 10215 1727204070.74810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.74813: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.74865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.74868: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
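Unescaped onto multiple lines, the temp-directory command issued just above reads as follows: umask 77 makes the directory accessible only by its owner, the backquoted echo presumably lets the remote shell expand anything in the configured remote_tmp path, and the final echo prints a name=path pair that the controller parses to learn where to upload AnsiballZ_command.py.

    ( umask 77 \
      && mkdir -p "` echo /root/.ansible/tmp `" \
      && mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 `" \
      && echo ansible-tmp-1727204070.7430127-12269-251934135146438="` echo /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 `" \
    ) && sleep 0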
10215 1727204070.74905: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.76939: stdout chunk (state=3): >>>ansible-tmp-1727204070.7430127-12269-251934135146438=/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 <<< 10215 1727204070.77060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.77109: stderr chunk (state=3): >>><<< 10215 1727204070.77113: stdout chunk (state=3): >>><<< 10215 1727204070.77131: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204070.7430127-12269-251934135146438=/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.77161: variable 'ansible_module_compression' from source: unknown 10215 1727204070.77204: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-10215szdde4ay/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 10215 1727204070.77240: variable 'ansible_facts' from source: unknown 10215 1727204070.77306: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py 10215 1727204070.77416: Sending initial data 10215 1727204070.77420: Sent initial data (156 bytes) 10215 1727204070.77851: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.77894: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204070.77898: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.77900: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address <<< 10215 1727204070.77903: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204070.77906: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.77952: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.77956: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.77999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.79640: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 10215 1727204070.79645: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 10215 1727204070.79673: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 10215 1727204070.79712: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-10215szdde4ay/tmp71wmddmh /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py <<< 10215 1727204070.79715: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py" <<< 10215 1727204070.79746: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-10215szdde4ay/tmp71wmddmh" to remote "/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py" <<< 10215 1727204070.79750: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py" <<< 10215 1727204070.80522: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.80582: stderr chunk (state=3): >>><<< 10215 1727204070.80586: stdout chunk (state=3): >>><<< 10215 1727204070.80613: done transferring module to remote 10215 1727204070.80621: _low_level_execute_command(): starting 10215 1727204070.80626: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/ /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py && sleep 0' 10215 1727204070.81052: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.81061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 10215 1727204070.81086: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.81093: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.81158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 10215 1727204070.81161: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.81196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204070.83126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204070.83170: stderr chunk (state=3): >>><<< 10215 1727204070.83174: stdout chunk (state=3): >>><<< 10215 1727204070.83191: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204070.83194: _low_level_execute_command(): starting 10215 1727204070.83200: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/AnsiballZ_command.py && sleep 0' 10215 1727204070.83654: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 10215 1727204070.83657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.83660: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204070.83662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204070.83712: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204070.83716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204070.83766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204071.02173: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3055sec preferred_lft 3055sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:31.011396", "end": "2024-09-24 14:54:31.020370", "delta": "0:00:00.008974", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 10215 1727204071.03893: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 
<<< 10215 1727204071.03953: stderr chunk (state=3): >>><<< 10215 1727204071.03958: stdout chunk (state=3): >>><<< 10215 1727204071.03976: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3055sec preferred_lft 3055sec\n inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:31.011396", "end": "2024-09-24 14:54:31.020370", "delta": "0:00:00.008974", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.10.90 closed. 10215 1727204071.04027: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 10215 1727204071.04036: _low_level_execute_command(): starting 10215 1727204071.04047: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204070.7430127-12269-251934135146438/ > /dev/null 2>&1 && sleep 0' 10215 1727204071.04494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204071.04531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204071.04534: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 10215 1727204071.04537: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 10215 1727204071.04587: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 10215 1727204071.04597: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 10215 1727204071.04634: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 10215 1727204071.06521: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 10215 1727204071.06571: stderr chunk (state=3): >>><<< 10215 1727204071.06574: stdout chunk (state=3): >>><<< 10215 1727204071.06592: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.10.90 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.10.90 originally 10.31.10.90 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 10215 1727204071.06600: handler run complete 10215 1727204071.06624: Evaluated conditional (False): False 10215 1727204071.06635: attempt loop complete, returning result 10215 1727204071.06642: _execute() done 10215 1727204071.06644: dumping result to json 10215 1727204071.06654: done dumping result, returning 10215 1727204071.06663: done running TaskExecutor() for managed-node3/TASK: Check routes and DNS [12b410aa-8751-3c74-8f8e-00000000056d] 10215 1727204071.06669: sending task result for task 12b410aa-8751-3c74-8f8e-00000000056d 10215 1727204071.06788: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000056d 10215 1727204071.06793: WORKER PROCESS EXITING ok: [managed-node3] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008974", "end": "2024-09-24 14:54:31.020370", "rc": 0, "start": "2024-09-24 14:54:31.011396" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:5e:c8:16:36:1d brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.10.90/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3055sec preferred_lft 3055sec inet6 fe80::37d3:4e93:30d:de94/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.10.90 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.10.90 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. 
# # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 10215 1727204071.06891: no more pending results, returning what we have 10215 1727204071.06896: results queue empty 10215 1727204071.06897: checking for any_errors_fatal 10215 1727204071.06899: done checking for any_errors_fatal 10215 1727204071.06899: checking for max_fail_percentage 10215 1727204071.06901: done checking for max_fail_percentage 10215 1727204071.06904: checking to see if all hosts have failed and the running result is not ok 10215 1727204071.06905: done checking to see if all hosts have failed 10215 1727204071.06906: getting the remaining hosts for this loop 10215 1727204071.06909: done getting the remaining hosts for this loop 10215 1727204071.06914: getting the next task for host managed-node3 10215 1727204071.06922: done getting next task for host managed-node3 10215 1727204071.06925: ^ task is: TASK: Verify DNS and network connectivity 10215 1727204071.06929: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 10215 1727204071.06939: getting variables 10215 1727204071.06940: in VariableManager get_vars() 10215 1727204071.06982: Calling all_inventory to load vars for managed-node3 10215 1727204071.06985: Calling groups_inventory to load vars for managed-node3 10215 1727204071.06987: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204071.07006: Calling all_plugins_play to load vars for managed-node3 10215 1727204071.07012: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204071.07016: Calling groups_plugins_play to load vars for managed-node3 10215 1727204071.08349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204071.09939: done with get_vars() 10215 1727204071.09962: done getting variables 10215 1727204071.10017: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.400) 0:00:39.667 ***** 10215 1727204071.10046: entering _queue_task() for managed-node3/shell 10215 1727204071.10305: worker is 1 (out of 1 available) 10215 1727204071.10322: exiting _queue_task() for managed-node3/shell 10215 1727204071.10334: done queuing things up, now waiting for results queue to drain 10215 1727204071.10336: waiting for pending results... 10215 1727204071.10531: running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity 10215 1727204071.10620: in run() - task 12b410aa-8751-3c74-8f8e-00000000056e 10215 1727204071.10634: variable 'ansible_search_path' from source: unknown 10215 1727204071.10638: variable 'ansible_search_path' from source: unknown 10215 1727204071.10671: calling self._execute() 10215 1727204071.10756: variable 'ansible_host' from source: host vars for 'managed-node3' 10215 1727204071.10762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node3' 10215 1727204071.10773: variable 'omit' from source: magic vars 10215 1727204071.11095: variable 'ansible_distribution_major_version' from source: facts 10215 1727204071.11107: Evaluated conditional (ansible_distribution_major_version != '6'): True 10215 1727204071.11225: variable 'ansible_facts' from source: unknown 10215 1727204071.12265: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 10215 1727204071.12269: when evaluation is False, skipping this task 10215 1727204071.12272: _execute() done 10215 1727204071.12275: dumping result to json 10215 1727204071.12278: done dumping result, returning 10215 1727204071.12280: done running TaskExecutor() for managed-node3/TASK: Verify DNS and network connectivity [12b410aa-8751-3c74-8f8e-00000000056e] 10215 1727204071.12282: sending task result for task 12b410aa-8751-3c74-8f8e-00000000056e 10215 1727204071.12446: done sending task result for task 12b410aa-8751-3c74-8f8e-00000000056e 10215 1727204071.12449: WORKER PROCESS EXITING skipping: [managed-node3] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 10215 1727204071.12521: no more 
pending results, returning what we have 10215 1727204071.12525: results queue empty 10215 1727204071.12526: checking for any_errors_fatal 10215 1727204071.12536: done checking for any_errors_fatal 10215 1727204071.12537: checking for max_fail_percentage 10215 1727204071.12539: done checking for max_fail_percentage 10215 1727204071.12540: checking to see if all hosts have failed and the running result is not ok 10215 1727204071.12541: done checking to see if all hosts have failed 10215 1727204071.12541: getting the remaining hosts for this loop 10215 1727204071.12543: done getting the remaining hosts for this loop 10215 1727204071.12546: getting the next task for host managed-node3 10215 1727204071.12556: done getting next task for host managed-node3 10215 1727204071.12559: ^ task is: TASK: meta (flush_handlers) 10215 1727204071.12561: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204071.12565: getting variables 10215 1727204071.12566: in VariableManager get_vars() 10215 1727204071.12605: Calling all_inventory to load vars for managed-node3 10215 1727204071.12611: Calling groups_inventory to load vars for managed-node3 10215 1727204071.12614: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204071.12625: Calling all_plugins_play to load vars for managed-node3 10215 1727204071.12629: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204071.12632: Calling groups_plugins_play to load vars for managed-node3 10215 1727204071.13862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204071.15920: done with get_vars() 10215 1727204071.15962: done getting variables 10215 1727204071.16044: in VariableManager get_vars() 10215 1727204071.16061: Calling all_inventory to load vars for managed-node3 10215 1727204071.16064: Calling groups_inventory to load vars for managed-node3 10215 1727204071.16066: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204071.16072: Calling all_plugins_play to load vars for managed-node3 10215 1727204071.16074: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204071.16078: Calling groups_plugins_play to load vars for managed-node3 10215 1727204071.18091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204071.19928: done with get_vars() 10215 1727204071.19965: done queuing things up, now waiting for results queue to drain 10215 1727204071.19967: results queue empty 10215 1727204071.19968: checking for any_errors_fatal 10215 1727204071.19970: done checking for any_errors_fatal 10215 1727204071.19971: checking for max_fail_percentage 10215 1727204071.19972: done checking for max_fail_percentage 10215 1727204071.19972: checking to see if all hosts have failed and the running result is not ok 10215 1727204071.19973: done checking to see if all hosts have failed 10215 1727204071.19974: getting the remaining hosts for this loop 10215 1727204071.19974: done getting the remaining hosts for this loop 10215 1727204071.19977: getting the next task for host managed-node3 10215 1727204071.19980: done getting next task for host managed-node3 10215 1727204071.19981: ^ task is: TASK: meta (flush_handlers) 10215 
1727204071.19983: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 10215 1727204071.19985: getting variables 10215 1727204071.19986: in VariableManager get_vars() 10215 1727204071.20000: Calling all_inventory to load vars for managed-node3 10215 1727204071.20002: Calling groups_inventory to load vars for managed-node3 10215 1727204071.20003: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204071.20009: Calling all_plugins_play to load vars for managed-node3 10215 1727204071.20011: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204071.20014: Calling groups_plugins_play to load vars for managed-node3 10215 1727204071.21523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204071.23867: done with get_vars() 10215 1727204071.23895: done getting variables 10215 1727204071.23943: in VariableManager get_vars() 10215 1727204071.23956: Calling all_inventory to load vars for managed-node3 10215 1727204071.23959: Calling groups_inventory to load vars for managed-node3 10215 1727204071.23961: Calling all_plugins_inventory to load vars for managed-node3 10215 1727204071.23966: Calling all_plugins_play to load vars for managed-node3 10215 1727204071.23968: Calling groups_plugins_inventory to load vars for managed-node3 10215 1727204071.23971: Calling groups_plugins_play to load vars for managed-node3 10215 1727204071.25718: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 10215 1727204071.28510: done with get_vars() 10215 1727204071.28804: done queuing things up, now waiting for results queue to drain 10215 1727204071.28807: results queue empty 10215 1727204071.28808: checking for any_errors_fatal 10215 1727204071.28810: done checking for any_errors_fatal 10215 1727204071.28811: checking for max_fail_percentage 10215 1727204071.28813: done checking for max_fail_percentage 10215 1727204071.28813: checking to see if all hosts have failed and the running result is not ok 10215 1727204071.28814: done checking to see if all hosts have failed 10215 1727204071.28815: getting the remaining hosts for this loop 10215 1727204071.28817: done getting the remaining hosts for this loop 10215 1727204071.28826: getting the next task for host managed-node3 10215 1727204071.28830: done getting next task for host managed-node3 10215 1727204071.28831: ^ task is: None 10215 1727204071.28833: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 10215 1727204071.28835: done queuing things up, now waiting for results queue to drain 10215 1727204071.28836: results queue empty 10215 1727204071.28837: checking for any_errors_fatal 10215 1727204071.28838: done checking for any_errors_fatal 10215 1727204071.28839: checking for max_fail_percentage 10215 1727204071.28840: done checking for max_fail_percentage 10215 1727204071.28841: checking to see if all hosts have failed and the running result is not ok 10215 1727204071.28842: done checking to see if all hosts have failed 10215 1727204071.28844: getting the next task for host managed-node3 10215 1727204071.28848: done getting next task for host managed-node3 10215 1727204071.28849: ^ task is: None 10215 1727204071.28852: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed-node3 : ok=75 changed=2 unreachable=0 failed=0 skipped=61 rescued=0 ignored=0

Tuesday 24 September 2024 14:54:31 -0400 (0:00:00.189) 0:00:39.857 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 2.32s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install dnsmasq --------------------------------------------------------- 2.20s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.19s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.84s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Install pgrep, sysctl --------------------------------------------------- 1.79s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Gathering Facts --------------------------------------------------------- 1.64s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
fedora.linux_system_roles.network : Check which packages are installed --- 1.32s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.30s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.20s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 1.16s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.09s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Gathering Facts --------------------------------------------------------- 1.04s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.88s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.71s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Gather the minimum subset of ansible_facts required by the network role test --- 0.70s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Get NM profile info ----------------------------------------------------- 0.57s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
** TEST check polling interval ------------------------------------------ 0.56s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75
Check if system is ostree ----------------------------------------------- 0.56s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Delete the device 'nm-bond' --------------------------------------------- 0.54s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114
Stat profile file ------------------------------------------------------- 0.53s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9
10215 1727204071.29104: RUNNING CLEANUP
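
The "Check routes and DNS" task above executed the shell script that is recorded verbatim in the module's _raw_params. For readers who want to re-run the same diagnostic outside this test run, a minimal standalone playbook is sketched below. Only the task name and the script body come from the log; the play wrapper, the hosts: pattern, gather_facts: false, and changed_when: false (inferred from the "changed": false reported by the callback) are assumptions, not a copy of tests/network/playbooks/tasks/check_network_dns.yml, whose source is not included in this log.

---
# Sketch only: reconstructed from the module_args captured in the log above,
# not copied from check_network_dns.yml.
- hosts: managed-node3        # assumption: same target host as in this run
  gather_facts: false         # assumption: the diagnostic needs no facts
  tasks:
    - name: Check routes and DNS
      ansible.builtin.shell: |
        set -euo pipefail
        echo IP
        ip a
        echo IP ROUTE
        ip route
        echo IP -6 ROUTE
        ip -6 route
        echo RESOLV
        if [ -f /etc/resolv.conf ]; then
         cat /etc/resolv.conf
        else
         echo NO /etc/resolv.conf
         ls -alrtF /etc/resolv.* || :
        fi
      changed_when: false     # inferred from the '"changed": false' in the result above

Run against the same host, this should reproduce the IP address, route, and resolv.conf dump shown in the STDOUT block of the task result above.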
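
The following task, "Verify DNS and network connectivity" (check_network_dns.yml:24), was skipped: the log shows ansible_distribution_major_version != '6' evaluating to True and ansible_facts["distribution"] == "CentOS" evaluating to False on this host. A hedged sketch of what such a guarded task could look like follows; only the task name, the use of the shell action (the log loads ActionModule 'shell' for it), and the two conditions are taken from the log. The command body, the play wrapper, and whether both conditions sit on the task itself or are inherited from an enclosing block are not visible here, so those parts are placeholders.

---
# Sketch only: the real task body at check_network_dns.yml:24 is not captured
# in this log; everything except the name, the shell action, and the two
# when: conditions is a placeholder or assumption.
- hosts: managed-node3
  gather_facts: true          # assumption: the conditions below need distribution facts
  tasks:
    - name: Verify DNS and network connectivity
      ansible.builtin.shell: echo "placeholder for the real connectivity check"
      when:
        - ansible_distribution_major_version != '6'
        - ansible_facts["distribution"] == "CentOS"

With these conditions the task runs only on CentOS hosts whose major version is not 6, which matches the skipping: [managed-node3] result recorded above.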